Apr 23 03:42:00 user nova-compute[71428]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
Apr 23 03:42:03 user nova-compute[71428]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=71428) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
Apr 23 03:42:03 user nova-compute[71428]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=71428) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
Apr 23 03:42:03 user nova-compute[71428]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=71428) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
Apr 23 03:42:03 user nova-compute[71428]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.020s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
Apr 23 03:42:03 user nova-compute[71428]: INFO nova.virt.driver [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] Loading compute driver 'libvirt.LibvirtDriver'
Apr 23 03:42:03 user nova-compute[71428]: INFO nova.compute.provider_config [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] Acquiring lock "singleton_lock" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] Acquired lock "singleton_lock" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] Releasing lock "singleton_lock" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] Full set of CONF: {{(pid=71428) _wait_for_exit_or_signal /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:362}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] ******************************************************************************** {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2589}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] Configuration options gathered from: {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2590}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] command line args: ['--config-file', '/etc/nova/nova-cpu.conf'] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2591}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] config files: ['/etc/nova/nova-cpu.conf'] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2592}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] ================================================================================ {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2594}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] allow_resize_to_same_host = True {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] arq_binding_timeout = 300 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] backdoor_port = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] backdoor_socket = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 
None None] block_device_allocate_retries = 300 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] block_device_allocate_retries_interval = 5 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cert = self.pem {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] compute_driver = libvirt.LibvirtDriver {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] compute_monitors = [] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] config_dir = [] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] config_drive_format = iso9660 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] config_file = ['/etc/nova/nova-cpu.conf'] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] config_source = [] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] console_host = user {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] control_exchange = nova {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cpu_allocation_ratio = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] daemon = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] debug = True {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] default_access_ip_network_name = None {{(pid=71428) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] default_availability_zone = nova {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] default_ephemeral_format = ext4 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] default_schedule_zone = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] disk_allocation_ratio = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] enable_new_services = True {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] enabled_apis = ['osapi_compute'] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] enabled_ssl_apis = [] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] flat_injected = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] force_config_drive = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] force_raw_images = True {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] graceful_shutdown_timeout = 5 {{(pid=71428) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] heal_instance_info_cache_interval = 60 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] host = user {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] initial_cpu_allocation_ratio = 4.0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] initial_disk_allocation_ratio = 1.0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] initial_ram_allocation_ratio = 1.0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] instance_build_timeout = 0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] instance_delete_interval = 300 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] instance_format = [instance: %(uuid)s] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] instance_name_template = instance-%08x {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] instance_usage_audit = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] instance_usage_audit_period = month {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] 
instances_path = /opt/stack/data/nova/instances {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] internal_service_availability_zone = internal {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] key = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] live_migration_retry_count = 30 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] log_config_append = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] log_dir = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] log_file = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] log_options = True {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] log_rotate_interval = 1 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] log_rotate_interval_type = days {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] log_rotation_type = none {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=71428) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] long_rpc_timeout = 1800 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] max_concurrent_builds = 10 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] max_concurrent_live_migrations = 1 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] max_concurrent_snapshots = 5 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] max_local_block_devices = 3 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] max_logfile_count = 30 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] max_logfile_size_mb = 200 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] maximum_instance_delete_attempts = 5 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] metadata_listen = 0.0.0.0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] metadata_listen_port = 8775 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user 
nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] metadata_workers = 3 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] migrate_max_retries = -1 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] mkisofs_cmd = genisoimage {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] my_block_storage_ip = 10.0.0.210 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] my_ip = 10.0.0.210 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] network_allocate_retries = 0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] osapi_compute_listen = 0.0.0.0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] osapi_compute_listen_port = 8774 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] osapi_compute_unique_server_name_scope = {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] osapi_compute_workers = 3 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] password_length = 12 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] periodic_enable = True {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] periodic_fuzzy_delay = 60 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: 
DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] pointer_model = ps2mouse {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] preallocate_images = none {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] publish_errors = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] pybasedir = /opt/stack/nova {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] ram_allocation_ratio = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] rate_limit_burst = 0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] rate_limit_except_level = CRITICAL {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] rate_limit_interval = 0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] reboot_timeout = 0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] reclaim_instance_interval = 0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] record = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] reimage_timeout_per_gb = 20 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] report_interval = 10 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] rescue_timeout = 0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] reserved_host_cpus = 
0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] reserved_host_disk_mb = 0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] reserved_host_memory_mb = 512 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] reserved_huge_pages = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] resize_confirm_window = 0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] resize_fs_using_block_device = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] resume_guests_state_on_host_boot = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] rpc_response_timeout = 60 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] run_external_periodic_tasks = True {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] running_deleted_instance_action = reap {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] running_deleted_instance_poll_interval = 1800 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] running_deleted_instance_timeout = 0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] scheduler_instance_sync_interval = 120 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None 
None] service_down_time = 60 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] servicegroup_driver = db {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] shelved_offload_time = 0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] shelved_poll_interval = 3600 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] shutdown_timeout = 0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] source_is_ipv6 = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] ssl_only = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] state_path = /opt/stack/data/nova {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] sync_power_state_interval = 600 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] sync_power_state_pool_size = 1000 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] syslog_log_facility = LOG_USER {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] tempdir = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] timeout_nbd = 10 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] transport_url = **** {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] update_resources_interval = 0 {{(pid=71428) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] use_cow_images = True {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] use_eventlog = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] use_journal = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] use_json = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] use_rootwrap_daemon = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] use_stderr = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] use_syslog = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vcpu_pin_set = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vif_plugging_is_fatal = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vif_plugging_timeout = 0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] virt_mkfs = [] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] volume_usage_poll_interval = 0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] watch_log_file = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] web = /usr/share/spice-html5 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG 
oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_concurrency.disable_process_locking = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_concurrency.lock_path = /opt/stack/data/nova {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_messaging_metrics.metrics_process_name = {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api.auth_strategy = keystone {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api.compute_link_prefix = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api.dhcp_domain = novalocal {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api.enable_instance_password = True {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api.glance_link_prefix = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:03 user nova-compute[71428]: DEBUG oslo_service.service 
[None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api.instance_list_cells_batch_strategy = distributed {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api.instance_list_per_project_cells = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api.list_records_by_skipping_down_cells = True {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api.local_metadata_per_cell = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api.max_limit = 1000 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api.metadata_cache_expiration = 15 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api.neutron_default_tenant_id = default {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api.use_forwarded_for = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api.use_neutron_default_nets = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api.vendordata_dynamic_failure_fatal = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api.vendordata_dynamic_ssl_certfile = {{(pid=71428) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api.vendordata_dynamic_targets = [] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api.vendordata_jsonfile_path = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api.vendordata_providers = ['StaticJSON'] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cache.backend = dogpile.cache.memcached {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cache.backend_argument = **** {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cache.config_prefix = cache.oslo {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cache.dead_timeout = 60.0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cache.debug_cache_backend = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cache.enable_retry_client = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cache.enable_socket_keepalive = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cache.enabled = True {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cache.expiration_time = 600 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cache.hashclient_retry_attempts = 2 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cache.hashclient_retry_delay = 1.0 
{{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cache.memcache_dead_retry = 300 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cache.memcache_password = {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cache.memcache_pool_maxsize = 10 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cache.memcache_pool_unused_timeout = 60 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cache.memcache_sasl_enabled = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cache.memcache_servers = ['localhost:11211'] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cache.memcache_socket_timeout = 1.0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cache.memcache_username = {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cache.proxies = [] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cache.retry_attempts = 2 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cache.retry_delay = 0.0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] 
cache.socket_keepalive_count = 1 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cache.socket_keepalive_idle = 1 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cache.socket_keepalive_interval = 1 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cache.tls_allowed_ciphers = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cache.tls_cafile = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cache.tls_certfile = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cache.tls_enabled = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cache.tls_keyfile = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cinder.auth_section = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cinder.auth_type = password {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cinder.cafile = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cinder.catalog_info = volumev3::publicURL {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cinder.certfile = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cinder.collect_timing = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cinder.cross_az_attach = True 
{{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cinder.debug = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cinder.endpoint_template = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cinder.http_retries = 3 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cinder.insecure = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cinder.keyfile = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cinder.os_region_name = RegionOne {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cinder.split_loggers = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cinder.timeout = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] compute.cpu_dedicated_set = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] compute.cpu_shared_set = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] compute.image_type_exclude_list = [] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] compute.live_migration_wait_for_vif_plug = True {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] compute.max_concurrent_disk_ops = 
0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] compute.max_disk_devices_to_attach = -1 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] compute.resource_provider_association_refresh = 300 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] compute.shutdown_retry_interval = 10 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] conductor.workers = 3 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] console.allowed_origins = [] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] console.ssl_ciphers = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] console.ssl_minimum_version = default {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] consoleauth.token_ttl = 600 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cyborg.cafile = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cyborg.certfile = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG 
oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cyborg.collect_timing = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cyborg.connect_retries = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cyborg.connect_retry_delay = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cyborg.endpoint_override = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cyborg.insecure = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cyborg.keyfile = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cyborg.max_version = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cyborg.min_version = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cyborg.region_name = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cyborg.service_name = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cyborg.service_type = accelerator {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cyborg.split_loggers = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cyborg.status_code_retries = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cyborg.status_code_retry_delay = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None 
req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cyborg.timeout = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] cyborg.version = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] database.backend = sqlalchemy {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] database.connection = **** {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] database.connection_debug = 0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] database.connection_parameters = {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] database.connection_recycle_time = 3600 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] database.connection_trace = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] database.db_inc_retry_interval = True {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] database.db_max_retries = 20 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] database.db_max_retry_interval = 10 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] database.db_retry_interval = 1 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] database.max_overflow = 50 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service 
[None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] database.max_pool_size = 5 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] database.max_retries = 10 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] database.mysql_enable_ndb = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] database.mysql_sql_mode = TRADITIONAL {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] database.mysql_wsrep_sync_wait = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] database.pool_timeout = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] database.retry_interval = 10 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] database.slave_connection = **** {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] database.sqlite_synchronous = True {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api_database.backend = sqlalchemy {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api_database.connection = **** {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api_database.connection_debug = 0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api_database.connection_parameters = {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api_database.connection_recycle_time = 3600 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: 
DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api_database.connection_trace = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api_database.db_inc_retry_interval = True {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api_database.db_max_retries = 20 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api_database.db_max_retry_interval = 10 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api_database.db_retry_interval = 1 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api_database.max_overflow = 50 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api_database.max_pool_size = 5 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api_database.max_retries = 10 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api_database.mysql_enable_ndb = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api_database.mysql_wsrep_sync_wait = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api_database.pool_timeout = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api_database.retry_interval = 10 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api_database.slave_connection = **** {{(pid=71428) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] api_database.sqlite_synchronous = True {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] devices.enabled_mdev_types = [] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] ephemeral_storage_encryption.enabled = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] ephemeral_storage_encryption.key_size = 512 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] glance.api_servers = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] glance.cafile = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] glance.certfile = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] glance.collect_timing = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] glance.connect_retries = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] glance.connect_retry_delay = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] glance.debug = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] glance.default_trusted_certificate_ids = [] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] 
glance.enable_certificate_validation = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] glance.enable_rbd_download = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] glance.endpoint_override = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] glance.insecure = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] glance.keyfile = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] glance.max_version = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] glance.min_version = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] glance.num_retries = 3 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] glance.rbd_ceph_conf = {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] glance.rbd_connect_timeout = 5 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] glance.rbd_pool = {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] glance.rbd_user = {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] glance.region_name = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] glance.service_name = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] glance.service_type = image {{(pid=71428) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] glance.split_loggers = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] glance.status_code_retries = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] glance.status_code_retry_delay = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] glance.timeout = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] glance.verify_glance_signatures = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] glance.version = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] guestfs.debug = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] hyperv.config_drive_cdrom = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] hyperv.config_drive_inject_password = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] hyperv.enable_instance_metrics_collection = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] hyperv.enable_remotefx = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] 
hyperv.instances_path_share = {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] hyperv.iscsi_initiator_list = [] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] hyperv.limit_cpu_features = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] hyperv.power_state_check_timeframe = 60 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] hyperv.power_state_event_polling_interval = 2 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] hyperv.use_multipath_io = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] hyperv.volume_attach_retry_count = 10 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] hyperv.volume_attach_retry_interval = 5 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] hyperv.vswitch_name = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] mks.enabled = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service 
[None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] image_cache.manager_interval = 2400 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] image_cache.precache_concurrency = 1 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] image_cache.remove_unused_base_images = True {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] image_cache.subdirectory_name = _base {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] ironic.api_max_retries = 60 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] ironic.api_retry_interval = 2 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] ironic.auth_section = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] ironic.auth_type = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] ironic.cafile = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] ironic.certfile = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] ironic.collect_timing = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 
23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] ironic.connect_retries = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] ironic.connect_retry_delay = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] ironic.endpoint_override = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] ironic.insecure = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] ironic.keyfile = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] ironic.max_version = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] ironic.min_version = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] ironic.partition_key = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] ironic.peer_list = [] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] ironic.region_name = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] ironic.serial_console_state_timeout = 10 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] ironic.service_name = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] ironic.service_type = baremetal {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] ironic.split_loggers = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG 
oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] ironic.status_code_retries = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] ironic.status_code_retry_delay = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] ironic.timeout = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] ironic.version = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] key_manager.fixed_key = **** {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] barbican.barbican_api_version = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] barbican.barbican_endpoint = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] barbican.barbican_endpoint_type = public {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] barbican.barbican_region_name = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] barbican.cafile = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] barbican.certfile = None {{(pid=71428) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] barbican.collect_timing = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] barbican.insecure = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] barbican.keyfile = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] barbican.number_of_retries = 60 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] barbican.retry_delay = 1 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] barbican.send_service_user_token = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] barbican.split_loggers = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] barbican.timeout = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] barbican.verify_ssl = True {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] barbican.verify_ssl_path = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] barbican_service_user.auth_section = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] barbican_service_user.auth_type = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] barbican_service_user.cafile = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] barbican_service_user.certfile = None {{(pid=71428) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] barbican_service_user.collect_timing = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] barbican_service_user.insecure = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] barbican_service_user.keyfile = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] barbican_service_user.split_loggers = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] barbican_service_user.timeout = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vault.approle_role_id = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vault.approle_secret_id = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vault.cafile = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vault.certfile = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vault.collect_timing = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vault.insecure = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vault.keyfile = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vault.kv_mountpoint = secret {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vault.kv_version = 2 {{(pid=71428) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vault.namespace = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vault.root_token_id = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vault.split_loggers = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vault.ssl_ca_crt_file = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vault.timeout = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vault.use_ssl = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] keystone.cafile = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] keystone.certfile = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] keystone.collect_timing = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] keystone.connect_retries = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] keystone.connect_retry_delay = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] keystone.endpoint_override = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] keystone.insecure = False {{(pid=71428) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] keystone.keyfile = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] keystone.max_version = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] keystone.min_version = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] keystone.region_name = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] keystone.service_name = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] keystone.service_type = identity {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] keystone.split_loggers = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] keystone.status_code_retries = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] keystone.status_code_retry_delay = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] keystone.timeout = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] keystone.version = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.connection_uri = {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.cpu_mode = custom {{(pid=71428) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.cpu_model_extra_flags = [] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: WARNING oslo_config.cfg [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] Deprecated: Option "cpu_model" from group "libvirt" is deprecated. Use option "cpu_models" from group "libvirt". Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.cpu_models = ['Nehalem'] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.cpu_power_governor_high = performance {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.cpu_power_governor_low = powersave {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.cpu_power_management = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.device_detach_attempts = 8 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.device_detach_timeout = 20 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.disk_cachemodes = [] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.disk_prefix = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.enabled_perf_events = [] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.file_backed_memory = 0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.gid_maps = [] 
{{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.hw_disk_discard = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.hw_machine_type = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.images_rbd_ceph_conf = {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.images_rbd_glance_store_name = {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.images_rbd_pool = rbd {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.images_type = default {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.images_volume_group = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.inject_key = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.inject_partition = -2 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.inject_password = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.iscsi_iface = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] 
libvirt.iser_use_multipath = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.live_migration_bandwidth = 0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.live_migration_completion_timeout = 800 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.live_migration_downtime = 500 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.live_migration_downtime_delay = 75 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.live_migration_downtime_steps = 10 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.live_migration_inbound_addr = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.live_migration_permit_auto_converge = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.live_migration_permit_post_copy = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.live_migration_scheme = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.live_migration_timeout_action = abort {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.live_migration_tunnelled = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: WARNING oslo_config.cfg [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal ( Apr 23 03:42:04 user nova-compute[71428]: live_migration_uri is deprecated for removal in favor of two other options that Apr 23 03:42:04 user nova-compute[71428]: allow to change live migration scheme and target URI: ``live_migration_scheme`` Apr 23 03:42:04 user nova-compute[71428]: and 
``live_migration_inbound_addr`` respectively. Apr 23 03:42:04 user nova-compute[71428]: ). Its value may be silently ignored in the future. Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.live_migration_uri = qemu+ssh://stack@%s/system {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.live_migration_with_native_tls = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.max_queues = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.mem_stats_period_seconds = 10 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.nfs_mount_options = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.nfs_mount_point_base = /opt/stack/data/nova/mnt {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.num_aoe_discover_tries = 3 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.num_iser_scan_tries = 5 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.num_memory_encrypted_guests = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.num_nvme_discover_tries = 5 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.num_pcie_ports = 0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.num_volume_scan_tries = 5 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.pmem_namespaces = [] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user 
nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.quobyte_client_cfg = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.quobyte_mount_point_base = /opt/stack/data/nova/mnt {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.rbd_connect_timeout = 5 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.rbd_secret_uuid = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.rbd_user = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.realtime_scheduler_priority = 1 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.remote_filesystem_transport = ssh {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.rescue_image_id = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.rescue_kernel_id = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.rescue_ramdisk_id = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.rng_dev_path = /dev/urandom {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.rx_queue_size = None {{(pid=71428) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.smbfs_mount_options = {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.smbfs_mount_point_base = /opt/stack/data/nova/mnt {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.snapshot_compression = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.snapshot_image_format = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.sparse_logical_volumes = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.swtpm_enabled = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.swtpm_group = tss {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.swtpm_user = tss {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.sysinfo_serial = unique {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.tx_queue_size = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.uid_maps = [] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.use_virtio_for_bridges = True {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] 
libvirt.virt_type = kvm {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.volume_clear = zero {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.volume_clear_size = 0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.volume_use_multipath = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.vzstorage_cache_path = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.vzstorage_mount_group = qemu {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.vzstorage_mount_opts = [] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/nova/mnt {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.vzstorage_mount_user = stack {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] neutron.auth_section = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] neutron.auth_type = password {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user 
nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] neutron.cafile = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] neutron.certfile = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] neutron.collect_timing = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] neutron.connect_retries = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] neutron.connect_retry_delay = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] neutron.default_floating_pool = public {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] neutron.endpoint_override = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] neutron.extension_sync_interval = 600 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] neutron.http_retries = 3 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] neutron.insecure = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] neutron.keyfile = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] neutron.max_version = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] neutron.metadata_proxy_shared_secret = **** {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] neutron.min_version = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: 
DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] neutron.ovs_bridge = br-int {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] neutron.physnets = [] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] neutron.region_name = RegionOne {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] neutron.service_metadata_proxy = True {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] neutron.service_name = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] neutron.service_type = network {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] neutron.split_loggers = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] neutron.status_code_retries = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] neutron.status_code_retry_delay = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] neutron.timeout = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] neutron.version = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] notifications.bdms_in_notifications = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] notifications.default_level = INFO {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user 
nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] notifications.notification_format = unversioned {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] notifications.notify_on_state_change = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] pci.alias = [] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] pci.device_spec = [] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] pci.report_in_placement = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] placement.auth_section = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] placement.auth_type = password {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] placement.auth_url = http://10.0.0.210/identity {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] placement.cafile = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] placement.certfile = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] placement.collect_timing = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] placement.connect_retries = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] placement.connect_retry_delay = None {{(pid=71428) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] placement.default_domain_id = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] placement.default_domain_name = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] placement.domain_id = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] placement.domain_name = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] placement.endpoint_override = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] placement.insecure = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] placement.keyfile = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] placement.max_version = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] placement.min_version = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] placement.password = **** {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] placement.project_domain_id = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] placement.project_domain_name = Default {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] placement.project_id = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] placement.project_name = service {{(pid=71428) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] placement.region_name = RegionOne {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] placement.service_name = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] placement.service_type = placement {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] placement.split_loggers = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] placement.status_code_retries = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] placement.status_code_retry_delay = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] placement.system_scope = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] placement.timeout = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] placement.trust_id = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] placement.user_domain_id = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] placement.user_domain_name = Default {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] placement.user_id = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] placement.username = placement {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] placement.valid_interfaces = ['internal', 'public'] 
{{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] placement.version = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] quota.cores = 20 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] quota.count_usage_from_placement = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] quota.injected_file_content_bytes = 10240 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] quota.injected_file_path_length = 255 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] quota.injected_files = 5 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] quota.instances = 10 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] quota.key_pairs = 100 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] quota.metadata_items = 128 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] quota.ram = 51200 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] quota.recheck_quota = True {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] quota.server_group_members = 10 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] quota.server_groups = 10 {{(pid=71428) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] rdp.enabled = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] scheduler.image_metadata_prefilter = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] scheduler.max_attempts = 3 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] scheduler.max_placement_results = 1000 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] scheduler.query_placement_for_availability_zone = True {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] scheduler.query_placement_for_image_type_support = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] scheduler.workers = 3 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 
03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] filter_scheduler.aggregate_image_properties_isolation_separator = . {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] filter_scheduler.enabled_filters = ['AvailabilityZoneFilter', 'ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] filter_scheduler.host_subset_size = 1 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] filter_scheduler.image_properties_default_architecture = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] filter_scheduler.isolated_hosts = [] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: 
DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] filter_scheduler.isolated_images = [] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] filter_scheduler.max_instances_per_host = 50 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] filter_scheduler.pci_in_placement = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] filter_scheduler.track_instance_changes = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] metrics.required = True {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service 
[None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] metrics.weight_multiplier = 1.0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] metrics.weight_of_unavailable = -10000.0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] metrics.weight_setting = [] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] serial_console.enabled = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] serial_console.port_range = 10000:20000 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] serial_console.serialproxy_port = 6083 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] service_user.auth_section = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] service_user.auth_type = password {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] service_user.cafile = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] service_user.certfile = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] service_user.collect_timing = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 
03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] service_user.insecure = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] service_user.keyfile = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] service_user.send_service_user_token = True {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] service_user.split_loggers = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] service_user.timeout = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] spice.agent_enabled = True {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] spice.enabled = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] spice.html5proxy_base_url = http://10.0.0.210:6081/spice_auto.html {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] spice.html5proxy_host = 0.0.0.0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] spice.html5proxy_port = 6082 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] spice.image_compression = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] spice.jpeg_compression = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] spice.playback_compression = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] spice.server_listen = 127.0.0.1 {{(pid=71428) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] spice.streaming_mode = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] spice.zlib_compression = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] upgrade_levels.baseapi = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] upgrade_levels.cert = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] upgrade_levels.compute = auto {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] upgrade_levels.conductor = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] upgrade_levels.scheduler = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vendordata_dynamic_auth.auth_section = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vendordata_dynamic_auth.auth_type = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vendordata_dynamic_auth.cafile = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vendordata_dynamic_auth.certfile = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vendordata_dynamic_auth.collect_timing = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] 
vendordata_dynamic_auth.insecure = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vendordata_dynamic_auth.keyfile = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vendordata_dynamic_auth.split_loggers = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vendordata_dynamic_auth.timeout = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vmware.api_retry_count = 10 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vmware.ca_file = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vmware.cache_prefix = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vmware.cluster_name = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vmware.connection_pool_size = 10 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vmware.console_delay_seconds = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vmware.datastore_regex = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vmware.host_ip = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vmware.host_password = **** {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vmware.host_port = 443 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] 
vmware.host_username = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vmware.insecure = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vmware.integration_bridge = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vmware.maximum_objects = 100 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vmware.pbm_default_policy = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vmware.pbm_enabled = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vmware.pbm_wsdl_location = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vmware.serial_port_proxy_uri = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vmware.serial_port_service_uri = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vmware.task_poll_interval = 0.5 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vmware.use_linked_clone = True {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vmware.vnc_keymap = en-us {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vmware.vnc_port = 5900 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] 
vmware.vnc_port_total = 10000 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vnc.auth_schemes = ['none'] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vnc.enabled = True {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vnc.novncproxy_base_url = http://10.0.0.210:6080/vnc_lite.html {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vnc.novncproxy_port = 6080 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vnc.server_listen = 0.0.0.0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vnc.server_proxyclient_address = 10.0.0.210 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vnc.vencrypt_ca_certs = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vnc.vencrypt_client_cert = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vnc.vencrypt_client_key = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] workarounds.disable_fallback_pcpu_query = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] workarounds.disable_group_policy_check_upcall = True {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG 
oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] workarounds.disable_rootwrap = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] workarounds.enable_numa_live_migration = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] workarounds.handle_virt_lifecycle_events = True {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] workarounds.libvirt_disable_apic = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] workarounds.never_download_image_if_on_rbd = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None 
req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] wsgi.client_socket_timeout = 900 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] wsgi.default_pool_size = 1000 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] wsgi.keep_alive = True {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] wsgi.max_header_line = 16384 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] wsgi.secure_proxy_ssl_header = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] wsgi.ssl_ca_file = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] wsgi.ssl_cert_file = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] wsgi.ssl_key_file = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] wsgi.tcp_keepidle = 600 {{(pid=71428) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] zvm.ca_file = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] zvm.cloud_connector_url = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] zvm.image_tmp_path = /opt/stack/data/nova/images {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] zvm.reachable_timeout = 300 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_policy.enforce_new_defaults = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_policy.enforce_scope = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_policy.policy_default_rule = default {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_policy.policy_file = policy.yaml {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} 
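Every entry in this dump is produced by oslo.config's option logger, the log_opt_values call at oslo_config/cfg.py:2609 referenced on each line, which walks every registered option group and emits one "group.option = value" line at DEBUG level. The following is a minimal, illustrative sketch of that mechanism, assuming a plain oslo.config setup; the group and option names in it are hypothetical and are not nova's real options.

import logging
from oslo_config import cfg

LOG = logging.getLogger(__name__)
CONF = cfg.CONF

# Hypothetical option group, for illustration only (not nova's option set).
CONF.register_opts(
    [cfg.BoolOpt('enabled', default=True),
     cfg.PortOpt('proxy_port', default=6080)],
    group='example_console')

logging.basicConfig(level=logging.DEBUG)
CONF([])  # parse command line / config files (none given here)

# Logs "example_console.enabled = True", "example_console.proxy_port = 6080",
# one DEBUG line per registered option -- the same shape as the dump above.
CONF.log_opt_values(LOG, logging.DEBUG)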
Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] profiler.connection_string = messaging:// {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] profiler.enabled = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] profiler.es_doc_type = notification {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] profiler.es_scroll_size = 10000 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] profiler.es_scroll_time = 2m {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] profiler.filter_error_trace = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] profiler.hmac_keys = SECRET_KEY {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] profiler.sentinel_service_name = mymaster {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] profiler.socket_timeout = 0.1 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] 
profiler.trace_sqlalchemy = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] remote_debug.host = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] remote_debug.port = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=71428) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_messaging_rabbit.rabbit_quroum_max_memory_bytes = 0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_messaging_rabbit.rabbit_quroum_max_memory_length = 0 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 
{{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_messaging_rabbit.ssl = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_messaging_rabbit.ssl_version = {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_messaging_notifications.retry = -1 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_messaging_notifications.transport_url = **** {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_limit.auth_section = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_limit.auth_type = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user 
nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_limit.cafile = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_limit.certfile = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_limit.collect_timing = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_limit.connect_retries = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_limit.connect_retry_delay = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_limit.endpoint_id = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_limit.endpoint_override = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_limit.insecure = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_limit.keyfile = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_limit.max_version = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_limit.min_version = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_limit.region_name = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_limit.service_name = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_limit.service_type = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user 
nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_limit.split_loggers = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_limit.status_code_retries = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_limit.status_code_retry_delay = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_limit.timeout = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_limit.valid_interfaces = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_limit.version = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_reports.file_event_handler = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_reports.file_event_handler_interval = 1 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] oslo_reports.log_dir = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vif_plug_linux_bridge_privileged.group = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] 
vif_plug_linux_bridge_privileged.thread_pool_size = 12 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vif_plug_linux_bridge_privileged.user = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vif_plug_ovs_privileged.group = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vif_plug_ovs_privileged.helper_command = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vif_plug_ovs_privileged.thread_pool_size = 12 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] vif_plug_ovs_privileged.user = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] os_vif_linux_bridge.flat_interface = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=71428) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] os_vif_linux_bridge.vlan_interface = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] os_vif_ovs.isolate_vif = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] os_vif_ovs.ovsdb_interface = native {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] os_vif_ovs.per_port_bridge = False {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] os_brick.lock_path = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] privsep_osbrick.capabilities = [21] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] privsep_osbrick.group = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] privsep_osbrick.helper_command = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None 
req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] privsep_osbrick.thread_pool_size = 12 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] privsep_osbrick.user = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] nova_sys_admin.group = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] nova_sys_admin.helper_command = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] nova_sys_admin.thread_pool_size = 12 {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] nova_sys_admin.user = None {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG oslo_service.service [None req-463b00f9-3329-41cc-a2b8-ef504ec7bfb6 None None] ******************************************************************************** {{(pid=71428) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2613}} Apr 23 03:42:04 user nova-compute[71428]: INFO nova.service [-] Starting compute node (version 0.0.0) Apr 23 03:42:04 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Starting native event thread {{(pid=71428) _init_events /opt/stack/nova/nova/virt/libvirt/host.py:492}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Starting green dispatch thread {{(pid=71428) _init_events /opt/stack/nova/nova/virt/libvirt/host.py:498}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Starting connection event dispatch thread {{(pid=71428) initialize /opt/stack/nova/nova/virt/libvirt/host.py:620}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Connecting to libvirt: qemu:///system {{(pid=71428) _get_new_connection /opt/stack/nova/nova/virt/libvirt/host.py:503}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Registering for lifecycle events {{(pid=71428) 
_get_new_connection /opt/stack/nova/nova/virt/libvirt/host.py:509}} Apr 23 03:42:04 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Registering for connection events: {{(pid=71428) _get_new_connection /opt/stack/nova/nova/virt/libvirt/host.py:530}} Apr 23 03:42:04 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Connection event '1' reason 'None' Apr 23 03:42:04 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Cannot update service status on host "user" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host user could not be found. Apr 23 03:42:04 user nova-compute[71428]: DEBUG nova.virt.libvirt.volume.mount [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Initialising _HostMountState generation 0 {{(pid=71428) host_up /opt/stack/nova/nova/virt/libvirt/volume/mount.py:130}} Apr 23 03:42:11 user nova-compute[71428]: INFO nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Libvirt host capabilities
[Host capabilities XML followed here as bare 03:42:11 continuation lines; the XML markup was stripped when the log was captured. Recoverable values: host UUID e20c3142-5af9-7467-ecd8-70b2e4a210d6, arch x86_64, CPU model IvyBridge-IBRS (vendor Intel), migration transports tcp and rdma, two NUMA cells (logged memory/page counters 8152920/2038230/0 and 8255100/2063775/0), security models apparmor and dac (baselabel +64055:+108), and hvm guest support through the QEMU emulators for alpha, arm, aarch64, cris, i386, m68k, microblaze, microblazeel, mips, mipsel, mips64, mips64el, ppc, ppc64, ppc64le, riscv32, riscv64, s390x, sh4, sh4eb, sparc, sparc64, x86_64, xtensa and xtensaeb, each with its supported machine-type list (virt-*, pc-i440fx-*/pc-q35-*, pseries-*, s390-ccw-virtio-*, etc.).]
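The capabilities document summarized above can be re-fetched directly from libvirt; a minimal sketch, assuming the libvirt Python bindings (python3-libvirt) are installed and the caller can open qemu:///system, the same URI the log shows nova-compute connecting to:

import libvirt

# Connect to the URI logged above ("Connecting to libvirt: qemu:///system").
conn = libvirt.open('qemu:///system')
try:
    # Prints the full <capabilities> XML: host UUID, CPU model, NUMA cells,
    # security models and the per-arch guest/emulator list collapsed above.
    print(conn.getCapabilities())
finally:
    conn.close()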
nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Getting domain capabilities for alpha via machine types: {None} {{(pid=71428) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Error from libvirt when retrieving domain capabilities for arch alpha / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-alpha' on this host {{(pid=71428) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Getting domain capabilities for armv6l via machine types: {'virt', None} {{(pid=71428) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Error from libvirt when retrieving domain capabilities for arch armv6l / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-arm' on this host {{(pid=71428) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Error from libvirt when retrieving domain capabilities for arch armv6l / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-arm' on this host {{(pid=71428) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Getting domain capabilities for armv7l via machine types: {'virt'} {{(pid=71428) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Error from libvirt when retrieving domain capabilities for arch armv7l / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-arm' on this host {{(pid=71428) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Getting domain capabilities for aarch64 via machine types: {'virt'} {{(pid=71428) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Error from libvirt when retrieving domain capabilities for arch aarch64 / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-aarch64' on this host {{(pid=71428) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Getting domain capabilities for cris via machine types: {None} {{(pid=71428) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG 
nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Error from libvirt when retrieving domain capabilities for arch cris / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-cris' on this host {{(pid=71428) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Getting domain capabilities for i686 via machine types: {'ubuntu-q35', 'pc', 'ubuntu', 'q35'} {{(pid=71428) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=ubuntu-q35: [domain capabilities XML; element markup lost in capture. Recoverable values: emulator /usr/bin/qemu-system-i386; domain type kvm; machine pc-q35-jammy; arch i686; loader images /usr/share/OVMF/OVMF_CODE.fd, /usr/share/OVMF/OVMF_CODE.secboot.fd, /usr/share/AAVMF/AAVMF_CODE.fd, /usr/share/AAVMF/AAVMF32_CODE.fd, /usr/share/OVMF/OVMF_CODE.ms.fd; loader types rom, pflash; assorted yes/no and on/off enum flags; host CPU model IvyBridge-IBRS (vendor Intel); custom CPU models qemu64, qemu32, phenom, pentium3, pentium2, pentium, n270, kvm64, kvm32, coreduo, core2duo, athlon, Westmere-IBRS, Westmere, Snowridge, Skylake-Server-noTSX-IBRS, Skylake-Server-IBRS, Skylake-Server, Skylake-Client-noTSX-IBRS, Skylake-Client-IBRS, Skylake-Client, SandyBridge-IBRS, SandyBridge, Penryn, Opteron_G5, Opteron_G4, Opteron_G3, Opteron_G2, Opteron_G1, Nehalem-IBRS, Nehalem, IvyBridge-IBRS, IvyBridge, Icelake-Server-noTSX, Icelake-Server, Icelake-Client-noTSX, Icelake-Client, Haswell-noTSX-IBRS, Haswell-noTSX, Haswell-IBRS, Haswell, EPYC-Rome, EPYC-Milan, EPYC-IBPB, EPYC, Dhyana, Cooperlake, Conroe, Cascadelake-Server-noTSX, Cascadelake-Server, Broadwell-noTSX-IBRS, Broadwell-noTSX, Broadwell-IBRS, Broadwell, 486; memory backing sources file, anonymous, memfd; disk devices disk, cdrom, floppy, lun; disk buses fdc, scsi, virtio, usb, sata; disk models virtio, virtio-transitional, virtio-non-transitional; graphics types sdl, vnc, spice, egl-headless; hostdev mode subsystem with startup policies default, mandatory, requisite, optional and subsystem types usb, pci, scsi; further virtio, virtio-transitional, virtio-non-transitional model values; rng backends random, egd, builtin; filesystem drivers path, handle, virtiofs; tpm models tpm-tis, tpm-crb with backends passthrough, emulator] {{(pid=71428) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc: [domain capabilities XML; element markup lost in capture; identical to the ubuntu-q35 record above except machine pc-i440fx-6.2 and an additional ide disk bus] {{(pid=71428) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=ubuntu: [domain capabilities XML; element markup lost in capture; identical to the ubuntu-q35 record above except machine pc-i440fx-jammy and an additional ide disk bus] {{(pid=71428) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35: [domain capabilities XML; element markup lost in capture; identical to the ubuntu-q35 record above except machine pc-q35-6.2] {{(pid=71428) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Getting domain capabilities for m68k via machine types: {'virt', None} {{(pid=71428) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Error from libvirt when retrieving domain capabilities for arch m68k / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-m68k' on this host {{(pid=71428) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Error from libvirt when retrieving domain capabilities for arch m68k / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-m68k' on this host {{(pid=71428) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Getting domain capabilities for microblaze via machine types: {None} {{(pid=71428) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Error from libvirt when retrieving domain capabilities for arch microblaze / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-microblaze' on this host {{(pid=71428) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Getting domain capabilities for microblazeel via machine types: {None} {{(pid=71428) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Error from libvirt when retrieving domain capabilities for arch microblazeel / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-microblazeel' on this host {{(pid=71428) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Getting domain capabilities for mips via machine types: {None} {{(pid=71428) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Error from libvirt when retrieving domain 
capabilities for arch mips / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-mips' on this host {{(pid=71428) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Getting domain capabilities for mipsel via machine types: {None} {{(pid=71428) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Error from libvirt when retrieving domain capabilities for arch mipsel / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-mipsel' on this host {{(pid=71428) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Getting domain capabilities for mips64 via machine types: {None} {{(pid=71428) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Error from libvirt when retrieving domain capabilities for arch mips64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-mips64' on this host {{(pid=71428) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Getting domain capabilities for mips64el via machine types: {None} {{(pid=71428) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Error from libvirt when retrieving domain capabilities for arch mips64el / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-mips64el' on this host {{(pid=71428) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Getting domain capabilities for ppc via machine types: {None} {{(pid=71428) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Error from libvirt when retrieving domain capabilities for arch ppc / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc' on this host {{(pid=71428) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Getting domain capabilities for ppc64 via machine types: {'pseries', 'powernv', None} {{(pid=71428) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Error from libvirt when retrieving domain capabilities for arch 
ppc64 / virt_type kvm / machine_type pseries: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64' on this host {{(pid=71428) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Error from libvirt when retrieving domain capabilities for arch ppc64 / virt_type kvm / machine_type powernv: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64' on this host {{(pid=71428) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Error from libvirt when retrieving domain capabilities for arch ppc64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64' on this host {{(pid=71428) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Getting domain capabilities for ppc64le via machine types: {'pseries', 'powernv'} {{(pid=71428) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Error from libvirt when retrieving domain capabilities for arch ppc64le / virt_type kvm / machine_type pseries: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64le' on this host {{(pid=71428) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Error from libvirt when retrieving domain capabilities for arch ppc64le / virt_type kvm / machine_type powernv: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64le' on this host {{(pid=71428) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Getting domain capabilities for riscv32 via machine types: {None} {{(pid=71428) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Error from libvirt when retrieving domain capabilities for arch riscv32 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-riscv32' on this host {{(pid=71428) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Getting domain capabilities for riscv64 via machine types: {None} {{(pid=71428) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Error from libvirt when retrieving domain capabilities for arch riscv64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by 
'/usr/bin/qemu-system-riscv64' on this host {{(pid=71428) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Getting domain capabilities for s390x via machine types: {'s390-ccw-virtio'} {{(pid=71428) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Error from libvirt when retrieving domain capabilities for arch s390x / virt_type kvm / machine_type s390-ccw-virtio: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-s390x' on this host {{(pid=71428) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Getting domain capabilities for sh4 via machine types: {None} {{(pid=71428) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Error from libvirt when retrieving domain capabilities for arch sh4 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-sh4' on this host {{(pid=71428) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Getting domain capabilities for sh4eb via machine types: {None} {{(pid=71428) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Error from libvirt when retrieving domain capabilities for arch sh4eb / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-sh4eb' on this host {{(pid=71428) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Getting domain capabilities for sparc via machine types: {None} {{(pid=71428) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Error from libvirt when retrieving domain capabilities for arch sparc / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-sparc' on this host {{(pid=71428) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Getting domain capabilities for sparc64 via machine types: {None} {{(pid=71428) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Error from libvirt when retrieving domain capabilities for arch sparc64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-sparc64' 
on this host {{(pid=71428) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Getting domain capabilities for x86_64 via machine types: {'ubuntu-q35', 'pc', 'ubuntu', 'q35'} {{(pid=71428) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=ubuntu-q35:
Apr 23 03:42:11 user nova-compute[71428]: [multi-line domain capabilities XML elided; the markup was stripped in this capture. Recoverable values: emulator /usr/bin/qemu-system-x86_64, domain type kvm, machine pc-q35-jammy, arch x86_64, firmware efi with loaders /usr/share/OVMF/OVMF_CODE_4M.ms.fd, /usr/share/OVMF/OVMF_CODE_4M.secboot.fd and /usr/share/OVMF/OVMF_CODE_4M.fd (loader types rom/pflash, yes/no enums), host CPU model IvyBridge-IBRS (Intel), a long list of named CPU models (qemu64, kvm64, Nehalem, SandyBridge, IvyBridge, Haswell, Broadwell, Skylake, Icelake, Cascadelake, Cooperlake, Snowridge, EPYC/EPYC-Rome/EPYC-Milan, Opteron_G1..G5, 486 and variants), memory backing file/anonymous/memfd, disk devices disk/cdrom/floppy/lun on buses fdc/scsi/virtio/usb/sata, graphics sdl/vnc/spice/egl-headless, hostdev subsystem usb/pci/scsi with values default/mandatory/requisite/optional, rng backends random/egd/builtin, filesystem drivers path/handle/virtiofs, TPM models tpm-tis/tpm-crb with backends passthrough/emulator] {{(pid=71428) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Apr 23 03:42:11 user nova-compute[71428]: [domain capabilities XML elided, same shape as above; notable values: machine pc-i440fx-6.2, firmware loader /usr/share/OVMF/OVMF_CODE_4M.fd only, secure enum "no" only, disk buses additionally include ide] {{(pid=71428) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=ubuntu:
Apr 23 03:42:11 user nova-compute[71428]: [domain capabilities XML elided, same shape as above; notable values: machine pc-i440fx-jammy, firmware loader /usr/share/OVMF/OVMF_CODE_4M.fd only, secure enum "no" only, disk buses additionally include ide] {{(pid=71428) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Apr 23 03:42:11 user nova-compute[71428]: [domain capabilities XML elided, same shape as above; notable values: machine pc-q35-6.2, firmware loaders /usr/share/OVMF/OVMF_CODE_4M.ms.fd, /usr/share/OVMF/OVMF_CODE_4M.secboot.fd and /usr/share/OVMF/OVMF_CODE_4M.fd, yes/no secure enum, no ide bus listed] {{(pid=71428) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Getting domain capabilities for xtensa via machine types: {None} {{(pid=71428) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Error from libvirt when retrieving domain capabilities for arch xtensa / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-xtensa' on this host {{(pid=71428) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Getting domain capabilities for xtensaeb via machine types: {None} {{(pid=71428) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Error from libvirt when retrieving domain capabilities for arch xtensaeb / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-xtensaeb' on this host {{(pid=71428) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Checking secure boot support for host arch (x86_64) {{(pid=71428) supports_secure_boot /opt/stack/nova/nova/virt/libvirt/host.py:1750}}
Apr 23 03:42:11 user nova-compute[71428]: INFO nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Secure Boot support detected
Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] cpu compare xml: [CPU XML elided; markup stripped in this capture, recoverable value: model Nehalem] {{(pid=71428) _compare_cpu /opt/stack/nova/nova/virt/libvirt/driver.py:9996}}
Apr 23 03:42:11 user nova-compute[71428]: INFO nova.virt.node [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Generated node identity 3017e09c-9289-4a8e-8061-3ff90149e985
Apr 23 03:42:11 user nova-compute[71428]: INFO nova.virt.node [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Wrote node identity 3017e09c-9289-4a8e-8061-3ff90149e985 to /opt/stack/data/nova/compute_id
Apr 23 03:42:11 user nova-compute[71428]: WARNING nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Compute nodes ['3017e09c-9289-4a8e-8061-3ff90149e985'] for host user were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
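The capability probe above is driven through the libvirt API: for each (arch, machine type) pair the host reports, Nova requests a domain capabilities document and records failures such as the [Error Code 8] "KVM is not supported" entries for the non-native emulators. A minimal sketch of the same query with the libvirt Python binding follows; the connection URI and the x86_64/q35 arguments are illustrative, not taken from this host's configuration.

    import libvirt

    # Read-only connection to the local libvirt daemon (URI is illustrative).
    conn = libvirt.openReadOnly('qemu:///system')
    try:
        # Arguments: emulator binary, arch, machine type, virt type, flags.
        caps_xml = conn.getDomainCapabilities(
            '/usr/bin/qemu-system-x86_64', 'x86_64', 'q35', 'kvm', 0)
        print(caps_xml[:200])  # the XML document dumped (tag-stripped) in the log above
    except libvirt.libvirtError as exc:
        # Emulators without KVM support fail here; the log shows error code 8 for those.
        print('libvirt error %d: %s' % (exc.get_error_code(), exc))
    finally:
        conn.close()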
Apr 23 03:42:11 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host Apr 23 03:42:11 user nova-compute[71428]: WARNING nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] No compute node record found for host user. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host user could not be found. Apr 23 03:42:11 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:42:11 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Auditing locally available compute resources for user (node: user) {{(pid=71428) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 23 03:42:12 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:42:12 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
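The "compute_resources" acquire/release pairs above come from oslo.concurrency's lockutils, which the resource tracker uses to serialize updates to its in-memory view. A minimal sketch of the underlying primitive, with an illustrative lock name and function bodies:

    from oslo_concurrency import lockutils

    # Decorator form: the function body runs with the named lock held.
    @lockutils.synchronized('compute_resources')
    def update_usage():
        print('updating tracked resources under lock')

    # Context-manager form, equivalent to the acquired/released lines in the log.
    def clean_cache():
        with lockutils.lock('compute_resources'):
            print('cleaning compute node cache under lock')

    update_usage()
    clean_cache()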
Apr 23 03:42:12 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Hypervisor/Node resource view: name=user free_ram=10812MB free_disk=26.784809112548828GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71428) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 23 03:42:12 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:42:12 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:42:12 user nova-compute[71428]: WARNING nova.compute.resource_tracker [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] No compute node record for user:3017e09c-9289-4a8e-8061-3ff90149e985: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 3017e09c-9289-4a8e-8061-3ff90149e985 could not be found. Apr 23 03:42:12 user nova-compute[71428]: INFO nova.compute.resource_tracker [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Compute node record created for user:user with uuid: 3017e09c-9289-4a8e-8061-3ff90149e985 Apr 23 03:42:12 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 23 03:42:12 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 23 03:42:12 user nova-compute[71428]: INFO nova.scheduler.client.report [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [req-aa1ba3a5-4bf4-44fa-ad2d-5db748c859ca] Created resource provider record via placement API for resource provider with UUID 3017e09c-9289-4a8e-8061-3ff90149e985 and name user. 
Apr 23 03:42:12 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] /sys/module/kvm_amd/parameters/sev does not exist {{(pid=71428) _kernel_supports_amd_sev /opt/stack/nova/nova/virt/libvirt/host.py:1766}} Apr 23 03:42:12 user nova-compute[71428]: INFO nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] kernel doesn't support AMD SEV Apr 23 03:42:12 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Updating inventory in ProviderTree for provider 3017e09c-9289-4a8e-8061-3ff90149e985 with inventory: {'MEMORY_MB': {'total': 16023, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 12, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 40, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 0}} {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 23 03:42:12 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71428) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 23 03:42:12 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Libvirt baseline CPU Apr 23 03:42:12 user nova-compute[71428]: x86_64 Apr 23 03:42:12 user nova-compute[71428]: Nehalem Apr 23 03:42:12 user nova-compute[71428]: Intel Apr 23 03:42:12 user nova-compute[71428]: Apr 23 03:42:12 user nova-compute[71428]: Apr 23 03:42:12 user nova-compute[71428]: {{(pid=71428) _get_guest_baseline_cpu_features /opt/stack/nova/nova/virt/libvirt/driver.py:12486}} Apr 23 03:42:12 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Updated inventory for provider 3017e09c-9289-4a8e-8061-3ff90149e985 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 16023, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 12, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 40, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} Apr 23 03:42:12 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Updating resource provider 3017e09c-9289-4a8e-8061-3ff90149e985 generation from 0 to 1 during operation: update_inventory {{(pid=71428) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} Apr 23 03:42:12 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Updating inventory in ProviderTree for provider 3017e09c-9289-4a8e-8061-3ff90149e985 with inventory: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} 
{{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 23 03:42:12 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Updating resource provider 3017e09c-9289-4a8e-8061-3ff90149e985 generation from 1 to 2 during operation: update_traits {{(pid=71428) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} Apr 23 03:42:12 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Compute_service record updated for user:user {{(pid=71428) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 23 03:42:12 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.614s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:42:12 user nova-compute[71428]: DEBUG nova.service [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Creating RPC server for service compute {{(pid=71428) start /opt/stack/nova/nova/service.py:182}} Apr 23 03:42:12 user nova-compute[71428]: DEBUG nova.service [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Join ServiceGroup membership for this service compute {{(pid=71428) start /opt/stack/nova/nova/service.py:199}} Apr 23 03:42:12 user nova-compute[71428]: DEBUG nova.servicegroup.drivers.db [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] DB_Driver: join new ServiceGroup member user to the compute group, service = {{(pid=71428) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} Apr 23 03:42:15 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._sync_power_states {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:42:15 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:43:03 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:43:03 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:43:03 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Starting heal instance info cache {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 23 03:43:03 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Rebuilding the list of instances to heal {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 23 03:43:03 user nova-compute[71428]: 
DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Didn't find any instances for network info cache update. {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 23 03:43:03 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:43:03 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:43:03 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:43:03 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:43:03 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:43:03 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:43:03 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71428) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 23 03:43:03 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:43:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:43:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:43:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:43:03 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Auditing locally available compute resources for user (node: user) {{(pid=71428) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 23 03:43:03 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:43:03 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 23 03:43:03 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Hypervisor/Node resource view: name=user free_ram=10217MB free_disk=26.698631286621094GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71428) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 23 03:43:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:43:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:43:04 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 23 03:43:04 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 23 03:43:04 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:43:04 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:43:04 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Compute_service record updated for user:user {{(pid=71428) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 23 03:43:04 user nova-compute[71428]: DEBUG 
oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.374s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:44:04 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:44:04 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:44:04 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:44:04 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:44:04 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71428) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 23 03:44:04 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:44:04 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:44:04 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:44:04 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:44:04 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:44:04 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock 
"compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:44:04 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Auditing locally available compute resources for user (node: user) {{(pid=71428) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 23 03:44:04 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:44:04 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:44:04 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Hypervisor/Node resource view: name=user free_ram=10204MB free_disk=26.744739532470703GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": 
"0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": 
"label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71428) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 23 03:44:04 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:44:04 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:44:05 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 23 03:44:05 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 23 03:44:05 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:44:05 
user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:44:05 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Compute_service record updated for user:user {{(pid=71428) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 23 03:44:05 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.133s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:44:06 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:44:06 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Starting heal instance info cache {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 23 03:44:06 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Rebuilding the list of instances to heal {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 23 03:44:06 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Didn't find any instances for network info cache update. 
{{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 23 03:44:06 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:44:06 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:45:04 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:45:04 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:45:04 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:45:04 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71428) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 23 03:45:05 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:45:05 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Starting heal instance info cache {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 23 03:45:05 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Rebuilding the list of instances to heal {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 23 03:45:05 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Didn't find any instances for network info cache update. 
{{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 23 03:45:05 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:45:05 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:45:05 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:45:05 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:45:05 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:45:05 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:45:05 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Auditing locally available compute resources for user (node: user) {{(pid=71428) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 23 03:45:05 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:45:05 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 23 03:45:05 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Hypervisor/Node resource view: name=user free_ram=10215MB free_disk=26.520755767822266GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71428) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 23 03:45:05 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:45:05 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:45:06 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 23 03:45:06 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 23 03:45:06 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:45:06 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:45:06 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Compute_service record updated for user:user {{(pid=71428) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 23 03:45:06 user nova-compute[71428]: DEBUG 
oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.107s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:45:07 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:45:07 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:46:04 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:46:05 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:46:05 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:46:06 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:46:06 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:46:06 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:46:06 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71428) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 23 03:46:07 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:46:07 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Starting heal instance info cache {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 23 03:46:07 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Rebuilding the list of instances to heal {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 23 03:46:07 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Didn't find any instances for network info cache update. {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 23 03:46:07 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:46:07 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:46:07 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:46:07 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:46:07 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:46:07 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:46:07 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Auditing locally available compute resources for user (node: user) {{(pid=71428) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 23 03:46:08 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None 
None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:46:08 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:46:08 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Hypervisor/Node resource view: name=user free_ram=9409MB free_disk=26.542194366455078GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, 
{"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": 
"8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71428) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 23 03:46:08 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:46:08 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:46:08 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 23 03:46:08 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 23 03:46:08 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:46:08 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:46:08 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Compute_service record updated for user:user {{(pid=71428) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 23 03:46:08 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.115s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:46:14 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Acquiring lock "56d8da41-3e04-465b-a1de-73d9e994682d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:46:14 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Lock "56d8da41-3e04-465b-a1de-73d9e994682d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:46:14 user nova-compute[71428]: DEBUG nova.compute.manager [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Starting instance... {{(pid=71428) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 23 03:46:14 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:46:14 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:46:14 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71428) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 23 03:46:14 user nova-compute[71428]: INFO nova.compute.claims [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Claim successful on node user Apr 23 03:46:15 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:46:15 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:46:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.340s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:46:15 user nova-compute[71428]: DEBUG nova.compute.manager [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Start building networks asynchronously for instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 23 03:46:15 user nova-compute[71428]: DEBUG nova.compute.manager [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Allocating IP information in the background. {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 23 03:46:15 user nova-compute[71428]: DEBUG nova.network.neutron [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] allocate_for_instance() {{(pid=71428) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 23 03:46:15 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 23 03:46:15 user nova-compute[71428]: DEBUG nova.compute.manager [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Start building block device mappings for instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 23 03:46:15 user nova-compute[71428]: DEBUG nova.compute.manager [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Start spawning the instance on the hypervisor. {{(pid=71428) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 23 03:46:15 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Creating instance directory {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 23 03:46:15 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Creating image(s) Apr 23 03:46:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Acquiring lock "/opt/stack/data/nova/instances/56d8da41-3e04-465b-a1de-73d9e994682d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:46:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Lock "/opt/stack/data/nova/instances/56d8da41-3e04-465b-a1de-73d9e994682d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:46:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Lock "/opt/stack/data/nova/instances/56d8da41-3e04-465b-a1de-73d9e994682d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:46:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Acquiring lock "935dc3474dddbde110531a31510941caddc0ae83" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:46:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None 
req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:46:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Acquiring lock "f279f3d3-581d-4d6f-924f-4104ec23832a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:46:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Lock "f279f3d3-581d-4d6f-924f-4104ec23832a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:46:15 user nova-compute[71428]: DEBUG nova.compute.manager [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Starting instance... {{(pid=71428) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 23 03:46:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:46:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:46:15 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71428) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 23 03:46:15 user nova-compute[71428]: INFO nova.compute.claims [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Claim successful on node user Apr 23 03:46:15 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:46:15 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:46:16 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83.part --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:46:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.644s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:46:16 user nova-compute[71428]: DEBUG nova.compute.manager [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Start building networks asynchronously for instance. 
{{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 23 03:46:16 user nova-compute[71428]: DEBUG nova.policy [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4e6ddab9797c449d85bf6f3a241473e0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c4f91ccdb4da4cb4bc4b55b7ac2189f0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71428) authorize /opt/stack/nova/nova/policy.py:203}} Apr 23 03:46:16 user nova-compute[71428]: DEBUG nova.compute.manager [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Allocating IP information in the background. {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 23 03:46:16 user nova-compute[71428]: DEBUG nova.network.neutron [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] allocate_for_instance() {{(pid=71428) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 23 03:46:16 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83.part --force-share --output=json" returned: 0 in 0.175s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:46:16 user nova-compute[71428]: DEBUG nova.virt.images [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] e6127373-9931-4277-9458-eceef653ea1e was qcow2, converting to raw {{(pid=71428) fetch_to_raw /opt/stack/nova/nova/virt/images.py:165}} Apr 23 03:46:16 user nova-compute[71428]: DEBUG nova.privsep.utils [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71428) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 23 03:46:16 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83.part /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83.converted {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:46:16 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None 
req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 23 03:46:16 user nova-compute[71428]: DEBUG nova.compute.manager [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Start building block device mappings for instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 23 03:46:16 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83.part /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83.converted" returned: 0 in 0.215s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:46:16 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83.converted --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:46:16 user nova-compute[71428]: DEBUG nova.compute.manager [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Start spawning the instance on the hypervisor. 
{{(pid=71428) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 23 03:46:16 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Creating instance directory {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 23 03:46:16 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Creating image(s) Apr 23 03:46:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Acquiring lock "/opt/stack/data/nova/instances/f279f3d3-581d-4d6f-924f-4104ec23832a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:46:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Lock "/opt/stack/data/nova/instances/f279f3d3-581d-4d6f-924f-4104ec23832a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:46:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Lock "/opt/stack/data/nova/instances/f279f3d3-581d-4d6f-924f-4104ec23832a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:46:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Acquiring lock "935dc3474dddbde110531a31510941caddc0ae83" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:46:16 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83.converted --force-share --output=json" returned: 0 in 0.159s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:46:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 
tempest-DeleteServersTestJSON-2062396414-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 1.318s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:46:16 user nova-compute[71428]: INFO oslo.privsep.daemon [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova-cpu.conf', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpg7u63tfh/privsep.sock'] Apr 23 03:46:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.089s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:46:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:46:16 user sudo[80143]: stack : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/nova-rootwrap /etc/nova/rootwrap.conf privsep-helper --config-file /etc/nova/nova-cpu.conf --privsep_context nova.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpg7u63tfh/privsep.sock Apr 23 03:46:16 user sudo[80143]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1001) Apr 23 03:46:16 user nova-compute[71428]: DEBUG nova.policy [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3311b6ae249f41269fde041f6e441840', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5864626fffa7443c800b244471966d65', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71428) authorize /opt/stack/nova/nova/policy.py:203}} Apr 23 03:46:18 user sudo[80143]: pam_unix(sudo:session): session closed for user root Apr 23 03:46:18 user nova-compute[71428]: INFO oslo.privsep.daemon [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Spawned new privsep daemon via rootwrap Apr 23 03:46:18 user nova-compute[71428]: INFO oslo.privsep.daemon [-] privsep daemon starting Apr 23 03:46:18 user nova-compute[71428]: INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0 Apr 23 03:46:18 user nova-compute[71428]: INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): 
CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none Apr 23 03:46:18 user nova-compute[71428]: INFO oslo.privsep.daemon [-] privsep daemon running as pid 80147 Apr 23 03:46:18 user nova-compute[71428]: WARNING oslo_privsep.priv_context [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] privsep daemon already running Apr 23 03:46:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:46:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:46:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.134s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:46:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Acquiring lock "935dc3474dddbde110531a31510941caddc0ae83" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:46:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:46:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env 
LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:46:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.139s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:46:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Acquiring lock "935dc3474dddbde110531a31510941caddc0ae83" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:46:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.145s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:46:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/f279f3d3-581d-4d6f-924f-4104ec23832a/disk 1073741824 {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:46:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/f279f3d3-581d-4d6f-924f-4104ec23832a/disk 1073741824" returned: 0 in 0.047s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:46:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.196s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:46:19 user 
nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:46:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.199s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:46:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:46:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.154s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:46:19 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Checking if we can resize image /opt/stack/data/nova/instances/f279f3d3-581d-4d6f-924f-4104ec23832a/disk. 
size=1073741824 {{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 23 03:46:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f279f3d3-581d-4d6f-924f-4104ec23832a/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:46:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.149s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:46:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/56d8da41-3e04-465b-a1de-73d9e994682d/disk 1073741824 {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:46:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/56d8da41-3e04-465b-a1de-73d9e994682d/disk 1073741824" returned: 0 in 0.053s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:46:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.210s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:46:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:46:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None 
req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f279f3d3-581d-4d6f-924f-4104ec23832a/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:46:19 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Cannot resize image /opt/stack/data/nova/instances/f279f3d3-581d-4d6f-924f-4104ec23832a/disk to a smaller size. {{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 23 03:46:19 user nova-compute[71428]: DEBUG nova.objects.instance [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Lazy-loading 'migration_context' on Instance uuid f279f3d3-581d-4d6f-924f-4104ec23832a {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:46:19 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Created local disks {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 23 03:46:19 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Ensure instance console log exists: /opt/stack/data/nova/instances/f279f3d3-581d-4d6f-924f-4104ec23832a/console.log {{(pid=71428) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 23 03:46:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:46:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:46:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:46:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None 
req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.149s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:46:19 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Checking if we can resize image /opt/stack/data/nova/instances/56d8da41-3e04-465b-a1de-73d9e994682d/disk. size=1073741824 {{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 23 03:46:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/56d8da41-3e04-465b-a1de-73d9e994682d/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:46:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/56d8da41-3e04-465b-a1de-73d9e994682d/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:46:19 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Cannot resize image /opt/stack/data/nova/instances/56d8da41-3e04-465b-a1de-73d9e994682d/disk to a smaller size. 
{{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 23 03:46:19 user nova-compute[71428]: DEBUG nova.objects.instance [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Lazy-loading 'migration_context' on Instance uuid 56d8da41-3e04-465b-a1de-73d9e994682d {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:46:19 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Created local disks {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 23 03:46:19 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Ensure instance console log exists: /opt/stack/data/nova/instances/56d8da41-3e04-465b-a1de-73d9e994682d/console.log {{(pid=71428) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 23 03:46:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:46:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:46:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:46:20 user nova-compute[71428]: DEBUG nova.network.neutron [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Successfully created port: 552bc11e-f01b-4ee4-94e0-64ab1e19ad9e {{(pid=71428) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 23 03:46:21 user nova-compute[71428]: DEBUG nova.network.neutron [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Successfully created port: 680802e3-0304-496f-927b-855e4167272b {{(pid=71428) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 23 03:46:22 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 
tempest-AttachVolumeTestJSON-276721084-project-member] Acquiring lock "08d906c5-1698-4c17-8430-c98f10836398" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:46:22 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Lock "08d906c5-1698-4c17-8430-c98f10836398" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:46:22 user nova-compute[71428]: DEBUG nova.compute.manager [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Starting instance... {{(pid=71428) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 23 03:46:22 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:46:22 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:46:22 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71428) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 23 03:46:22 user nova-compute[71428]: INFO nova.compute.claims [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Claim successful on node user Apr 23 03:46:23 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:46:23 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:46:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.309s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:46:23 user nova-compute[71428]: DEBUG nova.compute.manager [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Start building networks asynchronously for instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 23 03:46:23 user nova-compute[71428]: DEBUG nova.compute.manager [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Allocating IP information in the background. {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 23 03:46:23 user nova-compute[71428]: DEBUG nova.network.neutron [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 08d906c5-1698-4c17-8430-c98f10836398] allocate_for_instance() {{(pid=71428) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 23 03:46:23 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 23 03:46:23 user nova-compute[71428]: DEBUG nova.compute.manager [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Start building block device mappings for instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 23 03:46:23 user nova-compute[71428]: DEBUG nova.policy [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e3d689f1c160478ca83bbff3104d8ec3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '70b031ddc5c94ca98e7161de03bda4b7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71428) authorize /opt/stack/nova/nova/policy.py:203}} Apr 23 03:46:23 user nova-compute[71428]: DEBUG nova.compute.manager [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Start spawning the instance on the hypervisor. {{(pid=71428) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 23 03:46:23 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Creating instance directory {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 23 03:46:23 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Creating image(s) Apr 23 03:46:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Acquiring lock "/opt/stack/data/nova/instances/08d906c5-1698-4c17-8430-c98f10836398/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:46:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Lock "/opt/stack/data/nova/instances/08d906c5-1698-4c17-8430-c98f10836398/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:46:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Lock "/opt/stack/data/nova/instances/08d906c5-1698-4c17-8430-c98f10836398/disk.info" "released" by 
"nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:46:23 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:46:23 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.164s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:46:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Acquiring lock "935dc3474dddbde110531a31510941caddc0ae83" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:46:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:46:23 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:46:23 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.159s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:46:23 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 
tempest-AttachVolumeTestJSON-276721084-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/08d906c5-1698-4c17-8430-c98f10836398/disk 1073741824 {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:46:23 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/08d906c5-1698-4c17-8430-c98f10836398/disk 1073741824" returned: 0 in 0.054s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:46:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.219s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:46:23 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:46:24 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.121s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:46:24 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Checking if we can resize image /opt/stack/data/nova/instances/08d906c5-1698-4c17-8430-c98f10836398/disk. 
size=1073741824 {{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 23 03:46:24 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/08d906c5-1698-4c17-8430-c98f10836398/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:46:24 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/08d906c5-1698-4c17-8430-c98f10836398/disk --force-share --output=json" returned: 0 in 0.122s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:46:24 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Cannot resize image /opt/stack/data/nova/instances/08d906c5-1698-4c17-8430-c98f10836398/disk to a smaller size. {{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 23 03:46:24 user nova-compute[71428]: DEBUG nova.objects.instance [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Lazy-loading 'migration_context' on Instance uuid 08d906c5-1698-4c17-8430-c98f10836398 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:46:24 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Created local disks {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 23 03:46:24 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Ensure instance console log exists: /opt/stack/data/nova/instances/08d906c5-1698-4c17-8430-c98f10836398/console.log {{(pid=71428) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 23 03:46:24 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:46:24 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s 
{{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:46:24 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:46:24 user nova-compute[71428]: DEBUG nova.network.neutron [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Successfully updated port: 552bc11e-f01b-4ee4-94e0-64ab1e19ad9e {{(pid=71428) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 23 03:46:24 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Acquiring lock "refresh_cache-56d8da41-3e04-465b-a1de-73d9e994682d" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:46:24 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Acquired lock "refresh_cache-56d8da41-3e04-465b-a1de-73d9e994682d" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:46:24 user nova-compute[71428]: DEBUG nova.network.neutron [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Building network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 23 03:46:24 user nova-compute[71428]: DEBUG nova.network.neutron [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Successfully updated port: 680802e3-0304-496f-927b-855e4167272b {{(pid=71428) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 23 03:46:24 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Acquiring lock "refresh_cache-f279f3d3-581d-4d6f-924f-4104ec23832a" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:46:24 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Acquired lock "refresh_cache-f279f3d3-581d-4d6f-924f-4104ec23832a" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:46:24 user nova-compute[71428]: DEBUG nova.network.neutron [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Building 
network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 23 03:46:25 user nova-compute[71428]: DEBUG nova.network.neutron [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Instance cache missing network info. {{(pid=71428) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 23 03:46:25 user nova-compute[71428]: DEBUG nova.network.neutron [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Instance cache missing network info. {{(pid=71428) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 23 03:46:25 user nova-compute[71428]: DEBUG nova.compute.manager [req-424119bd-0262-4556-b0d5-3aa8745b50d7 req-4b8b3802-21b4-4360-8282-87acf87839eb service nova] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Received event network-changed-552bc11e-f01b-4ee4-94e0-64ab1e19ad9e {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:46:25 user nova-compute[71428]: DEBUG nova.compute.manager [req-424119bd-0262-4556-b0d5-3aa8745b50d7 req-4b8b3802-21b4-4360-8282-87acf87839eb service nova] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Refreshing instance network info cache due to event network-changed-552bc11e-f01b-4ee4-94e0-64ab1e19ad9e. {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 23 03:46:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-424119bd-0262-4556-b0d5-3aa8745b50d7 req-4b8b3802-21b4-4360-8282-87acf87839eb service nova] Acquiring lock "refresh_cache-56d8da41-3e04-465b-a1de-73d9e994682d" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:46:25 user nova-compute[71428]: DEBUG nova.network.neutron [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Successfully created port: 71dae390-2e66-4961-8ce7-1b8fff845732 {{(pid=71428) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.network.neutron [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Updating instance_info_cache with network_info: [{"id": "552bc11e-f01b-4ee4-94e0-64ab1e19ad9e", "address": "fa:16:3e:c8:1e:d4", "network": {"id": "47db66e9-5162-4031-9d71-5e13c16ee002", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1314718942-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c4f91ccdb4da4cb4bc4b55b7ac2189f0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap552bc11e-f0", "ovs_interfaceid": 
"552bc11e-f01b-4ee4-94e0-64ab1e19ad9e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.network.neutron [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Updating instance_info_cache with network_info: [{"id": "680802e3-0304-496f-927b-855e4167272b", "address": "fa:16:3e:45:76:bf", "network": {"id": "2354786d-cb38-48f3-9585-d88f27219024", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-812722065-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5864626fffa7443c800b244471966d65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap680802e3-03", "ovs_interfaceid": "680802e3-0304-496f-927b-855e4167272b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Releasing lock "refresh_cache-56d8da41-3e04-465b-a1de-73d9e994682d" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.compute.manager [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Instance network_info: |[{"id": "552bc11e-f01b-4ee4-94e0-64ab1e19ad9e", "address": "fa:16:3e:c8:1e:d4", "network": {"id": "47db66e9-5162-4031-9d71-5e13c16ee002", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1314718942-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c4f91ccdb4da4cb4bc4b55b7ac2189f0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap552bc11e-f0", "ovs_interfaceid": "552bc11e-f01b-4ee4-94e0-64ab1e19ad9e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils 
[req-424119bd-0262-4556-b0d5-3aa8745b50d7 req-4b8b3802-21b4-4360-8282-87acf87839eb service nova] Acquired lock "refresh_cache-56d8da41-3e04-465b-a1de-73d9e994682d" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.network.neutron [req-424119bd-0262-4556-b0d5-3aa8745b50d7 req-4b8b3802-21b4-4360-8282-87acf87839eb service nova] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Refreshing network info cache for port 552bc11e-f01b-4ee4-94e0-64ab1e19ad9e {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Start _get_guest_xml network_info=[{"id": "552bc11e-f01b-4ee4-94e0-64ab1e19ad9e", "address": "fa:16:3e:c8:1e:d4", "network": {"id": "47db66e9-5162-4031-9d71-5e13c16ee002", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1314718942-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c4f91ccdb4da4cb4bc4b55b7ac2189f0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap552bc11e-f0", "ovs_interfaceid": "552bc11e-f01b-4ee4-94e0-64ab1e19ad9e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'image_id': 'e6127373-9931-4277-9458-eceef653ea1e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 23 03:46:26 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
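[Annotation - not part of the captured log] The image-handling entries above (for instances f279f3d3, 56d8da41 and 08d906c5) all follow the same pattern: each "qemu-img info ... --force-share --output=json" probe is re-executed under "python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30", and the per-instance disk is then created as a qcow2 overlay on the cached _base image; the later "Cannot resize image ... to a smaller size" DEBUG lines are the normal outcome of the can_resize_image check when the overlay is already at the flavor's 1 GiB root size, not an error. The snippet below is only a minimal sketch of that wrapping using oslo.concurrency's public API; the helper names are invented for illustration, and Nova's real code paths are the ones named in the log (nova.virt.libvirt.imagebackend.Qcow2.create_image and nova/virt/disk/api.py).

    from oslo_concurrency import processutils

    # Matches the "--as=1073741824 --cpu=30" prefix recorded in the log:
    # cap qemu-img at 1 GiB of address space and 30 s of CPU time so a
    # malformed image header cannot hang or exhaust the compute host.
    QEMU_IMG_LIMITS = processutils.ProcessLimits(
        address_space=1 * 1024 * 1024 * 1024,
        cpu_time=30)

    def qemu_img_info_json(path):
        # With prlimit= set, oslo.concurrency re-executes the command under
        # "python -m oslo_concurrency.prlimit ... --", which is exactly the
        # command shape logged above. --force-share allows probing a disk
        # another process may hold open; --output=json keeps it parseable.
        out, _err = processutils.execute(
            'env', 'LC_ALL=C', 'LANG=C',
            'qemu-img', 'info', path, '--force-share', '--output=json',
            prlimit=QEMU_IMG_LIMITS)
        return out

    def create_qcow2_overlay(base, overlay, size_bytes):
        # The instance disk is a copy-on-write overlay on the shared _base
        # image, so only guest writes consume new space. As in the log, the
        # create call itself runs without the prlimit wrapper.
        processutils.execute(
            'env', 'LC_ALL=C', 'LANG=C',
            'qemu-img', 'create', '-f', 'qcow2',
            '-o', 'backing_file=%s,backing_fmt=raw' % base,
            overlay, str(size_bytes))

The per-image lock taken around create_qcow2_image (the "935dc34..." lock entries) serializes overlay creation against concurrent builds that share the same cached base file.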
Apr 23 03:46:26 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71428) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-23T03:43:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=), allow threads: True {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Flavor limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Image limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Flavor pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Image pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.virt.hardware [None 
req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Got 1 possible topologies {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.privsep.utils [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71428) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:46:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1138104',display_name='tempest-DeleteServersTestJSON-server-1138104',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-deleteserverstestjson-server-1138104',id=1,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4f91ccdb4da4cb4bc4b55b7ac2189f0',ramdisk_id='',reservation_id='r-kou1b0nk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-2062396414',owner_user_name='tempest-DeleteServersTestJSON-2062396414-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:46:15Z,user_data=None,user_id='4e6ddab9797c449d85bf6f3a241473e0',uuid=56d8da41-3e04-465b-a1de-73d9e994682d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "552bc11e-f01b-4ee4-94e0-64ab1e19ad9e", "address": "fa:16:3e:c8:1e:d4", "network": {"id": "47db66e9-5162-4031-9d71-5e13c16ee002", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1314718942-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c4f91ccdb4da4cb4bc4b55b7ac2189f0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap552bc11e-f0", "ovs_interfaceid": "552bc11e-f01b-4ee4-94e0-64ab1e19ad9e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71428) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Converting VIF {"id": "552bc11e-f01b-4ee4-94e0-64ab1e19ad9e", "address": "fa:16:3e:c8:1e:d4", "network": {"id": 
"47db66e9-5162-4031-9d71-5e13c16ee002", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1314718942-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c4f91ccdb4da4cb4bc4b55b7ac2189f0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap552bc11e-f0", "ovs_interfaceid": "552bc11e-f01b-4ee4-94e0-64ab1e19ad9e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c8:1e:d4,bridge_name='br-int',has_traffic_filtering=True,id=552bc11e-f01b-4ee4-94e0-64ab1e19ad9e,network=Network(47db66e9-5162-4031-9d71-5e13c16ee002),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap552bc11e-f0') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.objects.instance [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Lazy-loading 'pci_devices' on Instance uuid 56d8da41-3e04-465b-a1de-73d9e994682d {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Releasing lock "refresh_cache-f279f3d3-581d-4d6f-924f-4104ec23832a" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.compute.manager [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Instance network_info: |[{"id": "680802e3-0304-496f-927b-855e4167272b", "address": "fa:16:3e:45:76:bf", "network": {"id": "2354786d-cb38-48f3-9585-d88f27219024", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-812722065-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5864626fffa7443c800b244471966d65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap680802e3-03", "ovs_interfaceid": "680802e3-0304-496f-927b-855e4167272b", "qbh_params": null, "qbg_params": null, 
"active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Start _get_guest_xml network_info=[{"id": "680802e3-0304-496f-927b-855e4167272b", "address": "fa:16:3e:45:76:bf", "network": {"id": "2354786d-cb38-48f3-9585-d88f27219024", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-812722065-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5864626fffa7443c800b244471966d65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap680802e3-03", "ovs_interfaceid": "680802e3-0304-496f-927b-855e4167272b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'image_id': 'e6127373-9931-4277-9458-eceef653ea1e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 23 03:46:26 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:46:26 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71428) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-23T03:43:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=), allow threads: True {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Flavor limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Image limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Flavor pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Image pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum 
VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Got 1 possible topologies {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:46:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-329028935',display_name='tempest-ServerStableDeviceRescueTest-server-329028935',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverstabledevicerescuetest-server-329028935',id=2,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJloyUtsCDIs9T1+8g/hMHz6r8pR2SR9PggBoD/KR0LjhZoNK/nEt54tYO6knV7E+845Tt0p37M8EMNHSd7z4jcysmifK+fuISVR4KNyRapKSmK+MZDTAfLCZ5mhzyFU5A==',key_name='tempest-keypair-884305512',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5864626fffa7443c800b244471966d65',ramdisk_id='',reservation_id='r-lnebeump',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-847110386',owner_user_name='tempest-ServerStableDeviceRescueTest-847110386-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:46:17Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3311b6ae249f41269fde041f6e441840',uuid=f279f3d3-581d-4d6f-924f-4104ec23832a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "680802e3-0304-496f-927b-855e4167272b", "address": "fa:16:3e:45:76:bf", "network": {"id": "2354786d-cb38-48f3-9585-d88f27219024", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-812722065-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5864626fffa7443c800b244471966d65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap680802e3-03", "ovs_interfaceid": "680802e3-0304-496f-927b-855e4167272b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71428) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Converting VIF {"id": "680802e3-0304-496f-927b-855e4167272b", "address": "fa:16:3e:45:76:bf", "network": {"id": "2354786d-cb38-48f3-9585-d88f27219024", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-812722065-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": 
"10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5864626fffa7443c800b244471966d65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap680802e3-03", "ovs_interfaceid": "680802e3-0304-496f-927b-855e4167272b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:76:bf,bridge_name='br-int',has_traffic_filtering=True,id=680802e3-0304-496f-927b-855e4167272b,network=Network(2354786d-cb38-48f3-9585-d88f27219024),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap680802e3-03') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.objects.instance [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Lazy-loading 'pci_devices' on Instance uuid f279f3d3-581d-4d6f-924f-4104ec23832a {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] End _get_guest_xml xml= Apr 23 03:46:26 user nova-compute[71428]: 56d8da41-3e04-465b-a1de-73d9e994682d Apr 23 03:46:26 user nova-compute[71428]: instance-00000001 Apr 23 03:46:26 user nova-compute[71428]: 131072 Apr 23 03:46:26 user nova-compute[71428]: 1 Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: tempest-DeleteServersTestJSON-server-1138104 Apr 23 03:46:26 user nova-compute[71428]: 2023-04-23 03:46:26 Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: 128 Apr 23 03:46:26 user nova-compute[71428]: 1 Apr 23 03:46:26 user nova-compute[71428]: 0 Apr 23 03:46:26 user nova-compute[71428]: 0 Apr 23 03:46:26 user nova-compute[71428]: 1 Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: tempest-DeleteServersTestJSON-2062396414-project-member Apr 23 03:46:26 user nova-compute[71428]: tempest-DeleteServersTestJSON-2062396414 Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: OpenStack Foundation Apr 23 03:46:26 user nova-compute[71428]: OpenStack Nova Apr 23 03:46:26 user nova-compute[71428]: 0.0.0 
Apr 23 03:46:26 user nova-compute[71428]: 56d8da41-3e04-465b-a1de-73d9e994682d Apr 23 03:46:26 user nova-compute[71428]: 56d8da41-3e04-465b-a1de-73d9e994682d Apr 23 03:46:26 user nova-compute[71428]: Virtual Machine Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: hvm Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Nehalem Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: /dev/urandom Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:46:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1138104',display_name='tempest-DeleteServersTestJSON-server-1138104',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-deleteserverstestjson-server-1138104',id=1,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4f91ccdb4da4cb4bc4b55b7ac2189f0',ramdisk_id='',reservation_id='r-kou1b0nk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-2062396414',owner_user_name='tempest-DeleteServersTestJSON-2062396414-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:46:15Z,user_data=None,user_id='4e6ddab9797c449d85bf6f3a241473e0',uuid=56d8da41-3e04-465b-a1de-73d9e994682d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "552bc11e-f01b-4ee4-94e0-64ab1e19ad9e", "address": "fa:16:3e:c8:1e:d4", "network": {"id": "47db66e9-5162-4031-9d71-5e13c16ee002", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1314718942-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c4f91ccdb4da4cb4bc4b55b7ac2189f0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap552bc11e-f0", "ovs_interfaceid": "552bc11e-f01b-4ee4-94e0-64ab1e19ad9e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Converting VIF {"id": "552bc11e-f01b-4ee4-94e0-64ab1e19ad9e", "address": "fa:16:3e:c8:1e:d4", "network": {"id": "47db66e9-5162-4031-9d71-5e13c16ee002", 
"bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1314718942-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c4f91ccdb4da4cb4bc4b55b7ac2189f0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap552bc11e-f0", "ovs_interfaceid": "552bc11e-f01b-4ee4-94e0-64ab1e19ad9e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c8:1e:d4,bridge_name='br-int',has_traffic_filtering=True,id=552bc11e-f01b-4ee4-94e0-64ab1e19ad9e,network=Network(47db66e9-5162-4031-9d71-5e13c16ee002),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap552bc11e-f0') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG os_vif [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c8:1e:d4,bridge_name='br-int',has_traffic_filtering=True,id=552bc11e-f01b-4ee4-94e0-64ab1e19ad9e,network=Network(47db66e9-5162-4031-9d71-5e13c16ee002),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap552bc11e-f0') {{(pid=71428) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] End _get_guest_xml xml= Apr 23 03:46:26 user nova-compute[71428]: f279f3d3-581d-4d6f-924f-4104ec23832a Apr 23 03:46:26 user nova-compute[71428]: instance-00000002 Apr 23 03:46:26 user nova-compute[71428]: 131072 Apr 23 03:46:26 user nova-compute[71428]: 1 Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: tempest-ServerStableDeviceRescueTest-server-329028935 Apr 23 03:46:26 user nova-compute[71428]: 2023-04-23 03:46:26 Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: 128 Apr 23 03:46:26 user nova-compute[71428]: 1 Apr 23 03:46:26 user nova-compute[71428]: 0 Apr 23 03:46:26 user nova-compute[71428]: 0 Apr 23 03:46:26 user nova-compute[71428]: 1 Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: tempest-ServerStableDeviceRescueTest-847110386-project-member Apr 23 03:46:26 user nova-compute[71428]: tempest-ServerStableDeviceRescueTest-847110386 Apr 23 03:46:26 user 
nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: OpenStack Foundation Apr 23 03:46:26 user nova-compute[71428]: OpenStack Nova Apr 23 03:46:26 user nova-compute[71428]: 0.0.0 Apr 23 03:46:26 user nova-compute[71428]: f279f3d3-581d-4d6f-924f-4104ec23832a Apr 23 03:46:26 user nova-compute[71428]: f279f3d3-581d-4d6f-924f-4104ec23832a Apr 23 03:46:26 user nova-compute[71428]: Virtual Machine Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: hvm Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Nehalem Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: /dev/urandom Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: Apr 23 03:46:26 user nova-compute[71428]: {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:46:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-329028935',display_name='tempest-ServerStableDeviceRescueTest-server-329028935',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverstabledevicerescuetest-server-329028935',id=2,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJloyUtsCDIs9T1+8g/hMHz6r8pR2SR9PggBoD/KR0LjhZoNK/nEt54tYO6knV7E+845Tt0p37M8EMNHSd7z4jcysmifK+fuISVR4KNyRapKSmK+MZDTAfLCZ5mhzyFU5A==',key_name='tempest-keypair-884305512',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5864626fffa7443c800b244471966d65',ramdisk_id='',reservation_id='r-lnebeump',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-847110386',owner_user_name='tempest-ServerStableDeviceRescueTest-847110386-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:46:17Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3311b6ae249f41269fde041f6e441840',uuid=f279f3d3-581d-4d6f-924f-4104ec23832a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "680802e3-0304-496f-927b-855e4167272b", "address": "fa:16:3e:45:76:bf", "network": {"id": "2354786d-cb38-48f3-9585-d88f27219024", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-812722065-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5864626fffa7443c800b244471966d65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap680802e3-03", "ovs_interfaceid": "680802e3-0304-496f-927b-855e4167272b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) plug 
/opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Converting VIF {"id": "680802e3-0304-496f-927b-855e4167272b", "address": "fa:16:3e:45:76:bf", "network": {"id": "2354786d-cb38-48f3-9585-d88f27219024", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-812722065-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5864626fffa7443c800b244471966d65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap680802e3-03", "ovs_interfaceid": "680802e3-0304-496f-927b-855e4167272b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:76:bf,bridge_name='br-int',has_traffic_filtering=True,id=680802e3-0304-496f-927b-855e4167272b,network=Network(2354786d-cb38-48f3-9585-d88f27219024),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap680802e3-03') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG os_vif [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:76:bf,bridge_name='br-int',has_traffic_filtering=True,id=680802e3-0304-496f-927b-855e4167272b,network=Network(2354786d-cb38-48f3-9585-d88f27219024),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap680802e3-03') {{(pid=71428) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Created schema index Interface.name {{(pid=71428) autocreate_indices /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/__init__.py:106}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Created schema index Port.name {{(pid=71428) autocreate_indices /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/__init__.py:106}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 
tempest-DeleteServersTestJSON-2062396414-project-member] Created schema index Bridge.name {{(pid=71428) autocreate_indices /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/__init__.py:106}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] tcp:127.0.0.1:6640: entering CONNECTING {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [POLLIN] on fd 23 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] tcp:127.0.0.1:6640: entering BACKOFF {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] schema index .name already exists {{(pid=71428) autocreate_indices /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/__init__.py:102}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] schema index .name already exists {{(pid=71428) autocreate_indices /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/__init__.py:102}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] schema index .name already exists {{(pid=71428) autocreate_indices /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/__init__.py:102}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] [POLLOUT] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:26 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d9e54769-2452-484c-b0e2-f633607bb3bf 
tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:27 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:27 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:27 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:46:27 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 23 03:46:27 user nova-compute[71428]: INFO oslo.privsep.daemon [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova-cpu.conf', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp3dhjwqil/privsep.sock'] Apr 23 03:46:27 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:27 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:46:27 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 23 03:46:27 user sudo[80215]: stack : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/nova-rootwrap /etc/nova/rootwrap.conf privsep-helper --config-file /etc/nova/nova-cpu.conf --privsep_context vif_plug_ovs.privsep.vif_plug --privsep_sock_path /tmp/tmp3dhjwqil/privsep.sock Apr 23 03:46:27 user sudo[80215]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1001) Apr 23 03:46:27 user nova-compute[71428]: DEBUG nova.compute.manager [req-239e0da9-64f5-41de-888b-13b57acd0a10 req-f2ca9480-a081-4d09-9bbd-24568775b160 service nova] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Received event network-changed-680802e3-0304-496f-927b-855e4167272b {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:46:27 user nova-compute[71428]: DEBUG nova.compute.manager [req-239e0da9-64f5-41de-888b-13b57acd0a10 req-f2ca9480-a081-4d09-9bbd-24568775b160 service nova] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Refreshing instance network info cache due to event network-changed-680802e3-0304-496f-927b-855e4167272b. 
{{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 23 03:46:27 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-239e0da9-64f5-41de-888b-13b57acd0a10 req-f2ca9480-a081-4d09-9bbd-24568775b160 service nova] Acquiring lock "refresh_cache-f279f3d3-581d-4d6f-924f-4104ec23832a" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:46:27 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-239e0da9-64f5-41de-888b-13b57acd0a10 req-f2ca9480-a081-4d09-9bbd-24568775b160 service nova] Acquired lock "refresh_cache-f279f3d3-581d-4d6f-924f-4104ec23832a" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:46:27 user nova-compute[71428]: DEBUG nova.network.neutron [req-239e0da9-64f5-41de-888b-13b57acd0a10 req-f2ca9480-a081-4d09-9bbd-24568775b160 service nova] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Refreshing network info cache for port 680802e3-0304-496f-927b-855e4167272b {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 23 03:46:28 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:28 user sudo[80215]: pam_unix(sudo:session): session closed for user root Apr 23 03:46:28 user nova-compute[71428]: INFO oslo.privsep.daemon [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Spawned new privsep daemon via rootwrap Apr 23 03:46:28 user nova-compute[71428]: INFO oslo.privsep.daemon [-] privsep daemon starting Apr 23 03:46:28 user nova-compute[71428]: INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0 Apr 23 03:46:28 user nova-compute[71428]: INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none Apr 23 03:46:28 user nova-compute[71428]: INFO oslo.privsep.daemon [-] privsep daemon running as pid 80219 Apr 23 03:46:28 user nova-compute[71428]: WARNING oslo_privsep.priv_context [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] privsep daemon already running Apr 23 03:46:28 user nova-compute[71428]: DEBUG nova.network.neutron [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Successfully updated port: 71dae390-2e66-4961-8ce7-1b8fff845732 {{(pid=71428) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 23 03:46:28 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Acquiring lock "refresh_cache-08d906c5-1698-4c17-8430-c98f10836398" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:46:28 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Acquired lock "refresh_cache-08d906c5-1698-4c17-8430-c98f10836398" {{(pid=71428) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:46:28 user nova-compute[71428]: DEBUG nova.network.neutron [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Building network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 23 03:46:29 user nova-compute[71428]: DEBUG nova.network.neutron [req-424119bd-0262-4556-b0d5-3aa8745b50d7 req-4b8b3802-21b4-4360-8282-87acf87839eb service nova] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Updated VIF entry in instance network info cache for port 552bc11e-f01b-4ee4-94e0-64ab1e19ad9e. {{(pid=71428) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 23 03:46:29 user nova-compute[71428]: DEBUG nova.network.neutron [req-424119bd-0262-4556-b0d5-3aa8745b50d7 req-4b8b3802-21b4-4360-8282-87acf87839eb service nova] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Updating instance_info_cache with network_info: [{"id": "552bc11e-f01b-4ee4-94e0-64ab1e19ad9e", "address": "fa:16:3e:c8:1e:d4", "network": {"id": "47db66e9-5162-4031-9d71-5e13c16ee002", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1314718942-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c4f91ccdb4da4cb4bc4b55b7ac2189f0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap552bc11e-f0", "ovs_interfaceid": "552bc11e-f01b-4ee4-94e0-64ab1e19ad9e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:46:29 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-424119bd-0262-4556-b0d5-3aa8745b50d7 req-4b8b3802-21b4-4360-8282-87acf87839eb service nova] Releasing lock "refresh_cache-56d8da41-3e04-465b-a1de-73d9e994682d" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:46:29 user nova-compute[71428]: DEBUG nova.network.neutron [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Instance cache missing network info. 
{{(pid=71428) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 23 03:46:29 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:29 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap680802e3-03, may_exist=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:46:29 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap680802e3-03, col_values=(('external_ids', {'iface-id': '680802e3-0304-496f-927b-855e4167272b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:45:76:bf', 'vm-uuid': 'f279f3d3-581d-4d6f-924f-4104ec23832a'}),)) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:46:29 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:29 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:46:29 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:29 user nova-compute[71428]: INFO os_vif [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:76:bf,bridge_name='br-int',has_traffic_filtering=True,id=680802e3-0304-496f-927b-855e4167272b,network=Network(2354786d-cb38-48f3-9585-d88f27219024),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap680802e3-03') Apr 23 03:46:29 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:29 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap552bc11e-f0, may_exist=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:46:29 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap552bc11e-f0, col_values=(('external_ids', {'iface-id': '552bc11e-f01b-4ee4-94e0-64ab1e19ad9e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c8:1e:d4', 'vm-uuid': '56d8da41-3e04-465b-a1de-73d9e994682d'}),)) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:46:29 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:29 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:46:29 
user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:29 user nova-compute[71428]: INFO os_vif [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c8:1e:d4,bridge_name='br-int',has_traffic_filtering=True,id=552bc11e-f01b-4ee4-94e0-64ab1e19ad9e,network=Network(47db66e9-5162-4031-9d71-5e13c16ee002),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap552bc11e-f0') Apr 23 03:46:29 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] No BDM found with device name vda, not building metadata. {{(pid=71428) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 23 03:46:29 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] No VIF found with MAC fa:16:3e:45:76:bf, not building metadata {{(pid=71428) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 23 03:46:29 user nova-compute[71428]: DEBUG nova.compute.manager [req-4d9ceb06-224f-4709-bfcf-0aff0134daf6 req-cf841a3d-bc2a-45d8-be74-2f2e225465ea service nova] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Received event network-changed-71dae390-2e66-4961-8ce7-1b8fff845732 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:46:29 user nova-compute[71428]: DEBUG nova.compute.manager [req-4d9ceb06-224f-4709-bfcf-0aff0134daf6 req-cf841a3d-bc2a-45d8-be74-2f2e225465ea service nova] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Refreshing instance network info cache due to event network-changed-71dae390-2e66-4961-8ce7-1b8fff845732. {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 23 03:46:29 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-4d9ceb06-224f-4709-bfcf-0aff0134daf6 req-cf841a3d-bc2a-45d8-be74-2f2e225465ea service nova] Acquiring lock "refresh_cache-08d906c5-1698-4c17-8430-c98f10836398" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:46:29 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=71428) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 23 03:46:29 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] No VIF found with MAC fa:16:3e:c8:1e:d4, not building metadata {{(pid=71428) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 23 03:46:29 user nova-compute[71428]: DEBUG nova.network.neutron [req-239e0da9-64f5-41de-888b-13b57acd0a10 req-f2ca9480-a081-4d09-9bbd-24568775b160 service nova] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Updated VIF entry in instance network info cache for port 680802e3-0304-496f-927b-855e4167272b. {{(pid=71428) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 23 03:46:29 user nova-compute[71428]: DEBUG nova.network.neutron [req-239e0da9-64f5-41de-888b-13b57acd0a10 req-f2ca9480-a081-4d09-9bbd-24568775b160 service nova] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Updating instance_info_cache with network_info: [{"id": "680802e3-0304-496f-927b-855e4167272b", "address": "fa:16:3e:45:76:bf", "network": {"id": "2354786d-cb38-48f3-9585-d88f27219024", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-812722065-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5864626fffa7443c800b244471966d65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap680802e3-03", "ovs_interfaceid": "680802e3-0304-496f-927b-855e4167272b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:46:29 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-239e0da9-64f5-41de-888b-13b57acd0a10 req-f2ca9480-a081-4d09-9bbd-24568775b160 service nova] Releasing lock "refresh_cache-f279f3d3-581d-4d6f-924f-4104ec23832a" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:46:30 user nova-compute[71428]: DEBUG nova.network.neutron [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Updating instance_info_cache with network_info: [{"id": "71dae390-2e66-4961-8ce7-1b8fff845732", "address": "fa:16:3e:6e:74:5a", "network": {"id": "5604c946-dd68-4517-9c39-b4d33d86a345", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1424127131-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70b031ddc5c94ca98e7161de03bda4b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": 
true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap71dae390-2e", "ovs_interfaceid": "71dae390-2e66-4961-8ce7-1b8fff845732", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:46:30 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Releasing lock "refresh_cache-08d906c5-1698-4c17-8430-c98f10836398" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:46:30 user nova-compute[71428]: DEBUG nova.compute.manager [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Instance network_info: |[{"id": "71dae390-2e66-4961-8ce7-1b8fff845732", "address": "fa:16:3e:6e:74:5a", "network": {"id": "5604c946-dd68-4517-9c39-b4d33d86a345", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1424127131-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70b031ddc5c94ca98e7161de03bda4b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap71dae390-2e", "ovs_interfaceid": "71dae390-2e66-4961-8ce7-1b8fff845732", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 23 03:46:30 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-4d9ceb06-224f-4709-bfcf-0aff0134daf6 req-cf841a3d-bc2a-45d8-be74-2f2e225465ea service nova] Acquired lock "refresh_cache-08d906c5-1698-4c17-8430-c98f10836398" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:46:30 user nova-compute[71428]: DEBUG nova.network.neutron [req-4d9ceb06-224f-4709-bfcf-0aff0134daf6 req-cf841a3d-bc2a-45d8-be74-2f2e225465ea service nova] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Refreshing network info cache for port 71dae390-2e66-4961-8ce7-1b8fff845732 {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 23 03:46:30 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Start _get_guest_xml network_info=[{"id": "71dae390-2e66-4961-8ce7-1b8fff845732", "address": "fa:16:3e:6e:74:5a", "network": {"id": "5604c946-dd68-4517-9c39-b4d33d86a345", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1424127131-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70b031ddc5c94ca98e7161de03bda4b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap71dae390-2e", "ovs_interfaceid": "71dae390-2e66-4961-8ce7-1b8fff845732", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'image_id': 'e6127373-9931-4277-9458-eceef653ea1e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 23 03:46:30 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:46:30 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 23 03:46:30 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71428) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 23 03:46:30 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-23T03:43:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=), allow threads: True {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 23 03:46:30 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Flavor limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 23 03:46:30 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Image limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 23 03:46:30 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Flavor pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 23 03:46:30 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Image pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 23 03:46:30 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 23 03:46:30 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 23 03:46:30 user 
nova-compute[71428]: DEBUG nova.virt.hardware [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 23 03:46:30 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Got 1 possible topologies {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 23 03:46:30 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 23 03:46:30 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 23 03:46:30 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:46:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-1420796136',display_name='tempest-AttachVolumeTestJSON-server-1420796136',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-1420796136',id=3,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAlSA3423uMfFY5jWf8qjLn8fKTUZDZCbWqlOeGrb7q4dIPzzhHHma7J2h5uyB7cueX2lnELTDhfWhiWxlTN0oUzC4mH9t+dt+HV9lsJMmuXTTekJoF4InNO5IUJTfbZHQ==',key_name='tempest-keypair-1821880172',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='70b031ddc5c94ca98e7161de03bda4b7',ramdisk_id='',reservation_id='r-x3wh6pwx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-276721084',owner_user_name='tempest-AttachVolumeTestJSON-276721084-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:46:23Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e3d689f1c160478ca83bbff3104d8ec3',uuid=08d906c5-1698-4c17-8430-c98f10836398,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "71dae390-2e66-4961-8ce7-1b8fff845732", "address": "fa:16:3e:6e:74:5a", "network": {"id": "5604c946-dd68-4517-9c39-b4d33d86a345", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1424127131-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70b031ddc5c94ca98e7161de03bda4b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap71dae390-2e", "ovs_interfaceid": "71dae390-2e66-4961-8ce7-1b8fff845732", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71428) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 23 03:46:30 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Converting VIF {"id": "71dae390-2e66-4961-8ce7-1b8fff845732", "address": "fa:16:3e:6e:74:5a", "network": {"id": "5604c946-dd68-4517-9c39-b4d33d86a345", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1424127131-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, 
"tenant_id": "70b031ddc5c94ca98e7161de03bda4b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap71dae390-2e", "ovs_interfaceid": "71dae390-2e66-4961-8ce7-1b8fff845732", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:46:30 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6e:74:5a,bridge_name='br-int',has_traffic_filtering=True,id=71dae390-2e66-4961-8ce7-1b8fff845732,network=Network(5604c946-dd68-4517-9c39-b4d33d86a345),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71dae390-2e') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:46:30 user nova-compute[71428]: DEBUG nova.objects.instance [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Lazy-loading 'pci_devices' on Instance uuid 08d906c5-1698-4c17-8430-c98f10836398 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:46:30 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 08d906c5-1698-4c17-8430-c98f10836398] End _get_guest_xml xml= Apr 23 03:46:30 user nova-compute[71428]: 08d906c5-1698-4c17-8430-c98f10836398 Apr 23 03:46:30 user nova-compute[71428]: instance-00000003 Apr 23 03:46:30 user nova-compute[71428]: 131072 Apr 23 03:46:30 user nova-compute[71428]: 1 Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: tempest-AttachVolumeTestJSON-server-1420796136 Apr 23 03:46:30 user nova-compute[71428]: 2023-04-23 03:46:30 Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: 128 Apr 23 03:46:30 user nova-compute[71428]: 1 Apr 23 03:46:30 user nova-compute[71428]: 0 Apr 23 03:46:30 user nova-compute[71428]: 0 Apr 23 03:46:30 user nova-compute[71428]: 1 Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: tempest-AttachVolumeTestJSON-276721084-project-member Apr 23 03:46:30 user nova-compute[71428]: tempest-AttachVolumeTestJSON-276721084 Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: OpenStack Foundation Apr 23 03:46:30 user nova-compute[71428]: OpenStack Nova Apr 23 03:46:30 user nova-compute[71428]: 0.0.0 Apr 23 03:46:30 user nova-compute[71428]: 08d906c5-1698-4c17-8430-c98f10836398 
Apr 23 03:46:30 user nova-compute[71428]: 08d906c5-1698-4c17-8430-c98f10836398 Apr 23 03:46:30 user nova-compute[71428]: Virtual Machine Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: hvm Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Nehalem Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: /dev/urandom Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: Apr 23 03:46:30 user nova-compute[71428]: {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 23 03:46:30 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:46:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-1420796136',display_name='tempest-AttachVolumeTestJSON-server-1420796136',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-1420796136',id=3,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAlSA3423uMfFY5jWf8qjLn8fKTUZDZCbWqlOeGrb7q4dIPzzhHHma7J2h5uyB7cueX2lnELTDhfWhiWxlTN0oUzC4mH9t+dt+HV9lsJMmuXTTekJoF4InNO5IUJTfbZHQ==',key_name='tempest-keypair-1821880172',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='70b031ddc5c94ca98e7161de03bda4b7',ramdisk_id='',reservation_id='r-x3wh6pwx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-276721084',owner_user_name='tempest-AttachVolumeTestJSON-276721084-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:46:23Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e3d689f1c160478ca83bbff3104d8ec3',uuid=08d906c5-1698-4c17-8430-c98f10836398,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "71dae390-2e66-4961-8ce7-1b8fff845732", "address": "fa:16:3e:6e:74:5a", "network": {"id": "5604c946-dd68-4517-9c39-b4d33d86a345", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1424127131-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70b031ddc5c94ca98e7161de03bda4b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap71dae390-2e", "ovs_interfaceid": "71dae390-2e66-4961-8ce7-1b8fff845732", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 23 03:46:30 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Converting VIF {"id": "71dae390-2e66-4961-8ce7-1b8fff845732", "address": "fa:16:3e:6e:74:5a", "network": {"id": "5604c946-dd68-4517-9c39-b4d33d86a345", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1424127131-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": 
"70b031ddc5c94ca98e7161de03bda4b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap71dae390-2e", "ovs_interfaceid": "71dae390-2e66-4961-8ce7-1b8fff845732", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:46:30 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6e:74:5a,bridge_name='br-int',has_traffic_filtering=True,id=71dae390-2e66-4961-8ce7-1b8fff845732,network=Network(5604c946-dd68-4517-9c39-b4d33d86a345),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71dae390-2e') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:46:30 user nova-compute[71428]: DEBUG os_vif [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6e:74:5a,bridge_name='br-int',has_traffic_filtering=True,id=71dae390-2e66-4961-8ce7-1b8fff845732,network=Network(5604c946-dd68-4517-9c39-b4d33d86a345),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71dae390-2e') {{(pid=71428) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 23 03:46:30 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:30 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:46:30 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 23 03:46:30 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:30 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap71dae390-2e, may_exist=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:46:30 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap71dae390-2e, col_values=(('external_ids', {'iface-id': '71dae390-2e66-4961-8ce7-1b8fff845732', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6e:74:5a', 'vm-uuid': '08d906c5-1698-4c17-8430-c98f10836398'}),)) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:46:30 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 
{{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:30 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:46:30 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:30 user nova-compute[71428]: INFO os_vif [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6e:74:5a,bridge_name='br-int',has_traffic_filtering=True,id=71dae390-2e66-4961-8ce7-1b8fff845732,network=Network(5604c946-dd68-4517-9c39-b4d33d86a345),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71dae390-2e') Apr 23 03:46:30 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] No BDM found with device name vda, not building metadata. {{(pid=71428) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 23 03:46:30 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] No VIF found with MAC fa:16:3e:6e:74:5a, not building metadata {{(pid=71428) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 23 03:46:30 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:31 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:31 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:32 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:32 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:32 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:32 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:32 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:32 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:32 user nova-compute[71428]: DEBUG nova.network.neutron [req-4d9ceb06-224f-4709-bfcf-0aff0134daf6 req-cf841a3d-bc2a-45d8-be74-2f2e225465ea service nova] 
[instance: 08d906c5-1698-4c17-8430-c98f10836398] Updated VIF entry in instance network info cache for port 71dae390-2e66-4961-8ce7-1b8fff845732. {{(pid=71428) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 23 03:46:32 user nova-compute[71428]: DEBUG nova.network.neutron [req-4d9ceb06-224f-4709-bfcf-0aff0134daf6 req-cf841a3d-bc2a-45d8-be74-2f2e225465ea service nova] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Updating instance_info_cache with network_info: [{"id": "71dae390-2e66-4961-8ce7-1b8fff845732", "address": "fa:16:3e:6e:74:5a", "network": {"id": "5604c946-dd68-4517-9c39-b4d33d86a345", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1424127131-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70b031ddc5c94ca98e7161de03bda4b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap71dae390-2e", "ovs_interfaceid": "71dae390-2e66-4961-8ce7-1b8fff845732", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:46:32 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-4d9ceb06-224f-4709-bfcf-0aff0134daf6 req-cf841a3d-bc2a-45d8-be74-2f2e225465ea service nova] Releasing lock "refresh_cache-08d906c5-1698-4c17-8430-c98f10836398" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:46:33 user nova-compute[71428]: DEBUG nova.compute.manager [req-afc4795a-692a-4c50-abbb-cab9621d65cf req-403b38f6-9dda-42c0-b80e-a0ab0b6f1187 service nova] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Received event network-vif-plugged-552bc11e-f01b-4ee4-94e0-64ab1e19ad9e {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:46:33 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-afc4795a-692a-4c50-abbb-cab9621d65cf req-403b38f6-9dda-42c0-b80e-a0ab0b6f1187 service nova] Acquiring lock "56d8da41-3e04-465b-a1de-73d9e994682d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:46:33 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-afc4795a-692a-4c50-abbb-cab9621d65cf req-403b38f6-9dda-42c0-b80e-a0ab0b6f1187 service nova] Lock "56d8da41-3e04-465b-a1de-73d9e994682d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:46:33 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-afc4795a-692a-4c50-abbb-cab9621d65cf req-403b38f6-9dda-42c0-b80e-a0ab0b6f1187 service nova] Lock "56d8da41-3e04-465b-a1de-73d9e994682d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:46:33 user 
nova-compute[71428]: DEBUG nova.compute.manager [req-afc4795a-692a-4c50-abbb-cab9621d65cf req-403b38f6-9dda-42c0-b80e-a0ab0b6f1187 service nova] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] No waiting events found dispatching network-vif-plugged-552bc11e-f01b-4ee4-94e0-64ab1e19ad9e {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:46:33 user nova-compute[71428]: WARNING nova.compute.manager [req-afc4795a-692a-4c50-abbb-cab9621d65cf req-403b38f6-9dda-42c0-b80e-a0ab0b6f1187 service nova] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Received unexpected event network-vif-plugged-552bc11e-f01b-4ee4-94e0-64ab1e19ad9e for instance with vm_state building and task_state spawning. Apr 23 03:46:33 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:33 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:33 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:33 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:33 user nova-compute[71428]: DEBUG nova.compute.manager [req-4dd87f11-a6b4-4b49-9a31-0365e5bc166e req-d36a7420-cbb9-47e6-8e7a-6c3c22c4b605 service nova] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Received event network-vif-plugged-680802e3-0304-496f-927b-855e4167272b {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:46:33 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-4dd87f11-a6b4-4b49-9a31-0365e5bc166e req-d36a7420-cbb9-47e6-8e7a-6c3c22c4b605 service nova] Acquiring lock "f279f3d3-581d-4d6f-924f-4104ec23832a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:46:33 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-4dd87f11-a6b4-4b49-9a31-0365e5bc166e req-d36a7420-cbb9-47e6-8e7a-6c3c22c4b605 service nova] Lock "f279f3d3-581d-4d6f-924f-4104ec23832a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:46:33 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-4dd87f11-a6b4-4b49-9a31-0365e5bc166e req-d36a7420-cbb9-47e6-8e7a-6c3c22c4b605 service nova] Lock "f279f3d3-581d-4d6f-924f-4104ec23832a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:46:33 user nova-compute[71428]: DEBUG nova.compute.manager [req-4dd87f11-a6b4-4b49-9a31-0365e5bc166e req-d36a7420-cbb9-47e6-8e7a-6c3c22c4b605 service nova] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] No waiting events found dispatching network-vif-plugged-680802e3-0304-496f-927b-855e4167272b {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:46:33 user nova-compute[71428]: WARNING nova.compute.manager 
[req-4dd87f11-a6b4-4b49-9a31-0365e5bc166e req-d36a7420-cbb9-47e6-8e7a-6c3c22c4b605 service nova] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Received unexpected event network-vif-plugged-680802e3-0304-496f-927b-855e4167272b for instance with vm_state building and task_state spawning. Apr 23 03:46:34 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:34 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Resumed> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:46:35 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] VM Resumed (Lifecycle Event) Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.compute.manager [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Instance event wait completed in 0 seconds for {{(pid=71428) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Guest created on hypervisor {{(pid=71428) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.compute.manager [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Instance event wait completed in 0 seconds for {{(pid=71428) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Guest created on hypervisor {{(pid=71428) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 23 03:46:35 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Instance spawned successfully. Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 23 03:46:35 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Instance spawned successfully. 
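[annotation] The port plug earlier in this sequence (AddPortCommand and DbSetCommand against br-int, followed by "Successfully plugged vif") goes through ovsdbapp transactions, not the CLI. For reproducing the same state by hand while debugging, a rough manual equivalent under that assumption is sketched below; the helper name is illustrative only:

    import subprocess

    def plug_ovs_port(bridge, dev, iface_id, mac, vm_uuid):
        # Add the tap device to the integration bridge (idempotent, like may_exist=True).
        subprocess.run(["ovs-vsctl", "--may-exist", "add-port", bridge, dev], check=True)
        # Write the same external_ids the DbSetCommand above sets on the Interface row.
        subprocess.run([
            "ovs-vsctl", "set", "Interface", dev,
            'external_ids:iface-id="%s"' % iface_id,
            'external_ids:iface-status=active',
            'external_ids:attached-mac="%s"' % mac,
            'external_ids:vm-uuid="%s"' % vm_uuid,
        ], check=True)

    # plug_ovs_port("br-int", "tap71dae390-2e",
    #               "71dae390-2e66-4961-8ce7-1b8fff845732",
    #               "fa:16:3e:6e:74:5a",
    #               "08d906c5-1698-4c17-8430-c98f10836398")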
Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Found default for hw_cdrom_bus of ide {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Found default for hw_disk_bus of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Found default for hw_input_bus of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Found default for hw_pointer_model of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Found default for hw_video_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Found default for hw_vif_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] 
[instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Found default for hw_cdrom_bus of ide {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Found default for hw_disk_bus of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Found default for hw_input_bus of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Found default for hw_pointer_model of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Found default for hw_video_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Found default for hw_vif_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.compute.manager [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Instance event wait completed in 0 seconds for {{(pid=71428) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Guest created on hypervisor {{(pid=71428) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 23 03:46:35 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 
56d8da41-3e04-465b-a1de-73d9e994682d] During sync_power_state the instance has a pending task (spawning). Skip. Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Started> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:46:35 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] VM Started (Lifecycle Event) Apr 23 03:46:35 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Instance spawned successfully. Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 23 03:46:35 user nova-compute[71428]: INFO nova.compute.manager [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Took 18.52 seconds to spawn the instance on the hypervisor. Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.compute.manager [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:46:35 user nova-compute[71428]: INFO nova.compute.manager [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Took 19.76 seconds to spawn the instance on the hypervisor. 
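[annotation] The "During sync_power_state the instance has a pending task (spawning). Skip." lines reflect the guard that lifecycle-driven power-state sync yields to an in-flight task. A much-simplified sketch of that decision, with illustrative names rather than Nova's actual code:

    def should_sync_power_state(task_state, db_power_state, vm_power_state):
        """Illustrative only: defer the sync while a task such as 'spawning' owns the instance."""
        if task_state is not None:
            return False                        # task_state='spawning' above -> "Skip."
        return db_power_state != vm_power_state  # e.g. DB power_state 0 vs VM power_state 1

    # should_sync_power_state('spawning', 0, 1) -> False, matching the log entries above.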
Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.compute.manager [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Found default for hw_cdrom_bus of ide {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Found default for hw_disk_bus of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Found default for hw_input_bus of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Found default for hw_pointer_model of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Found default for hw_video_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Found default for hw_vif_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.compute.manager [req-0693fd3c-a9b8-4928-9d01-83c5e450552b req-79504c55-5796-45e9-aa98-4b04a2808776 service nova] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Received event network-vif-plugged-552bc11e-f01b-4ee4-94e0-64ab1e19ad9e {{(pid=71428) external_instance_event 
/opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-0693fd3c-a9b8-4928-9d01-83c5e450552b req-79504c55-5796-45e9-aa98-4b04a2808776 service nova] Acquiring lock "56d8da41-3e04-465b-a1de-73d9e994682d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-0693fd3c-a9b8-4928-9d01-83c5e450552b req-79504c55-5796-45e9-aa98-4b04a2808776 service nova] Lock "56d8da41-3e04-465b-a1de-73d9e994682d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.003s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-0693fd3c-a9b8-4928-9d01-83c5e450552b req-79504c55-5796-45e9-aa98-4b04a2808776 service nova] Lock "56d8da41-3e04-465b-a1de-73d9e994682d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.compute.manager [req-0693fd3c-a9b8-4928-9d01-83c5e450552b req-79504c55-5796-45e9-aa98-4b04a2808776 service nova] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] No waiting events found dispatching network-vif-plugged-552bc11e-f01b-4ee4-94e0-64ab1e19ad9e {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:46:35 user nova-compute[71428]: WARNING nova.compute.manager [req-0693fd3c-a9b8-4928-9d01-83c5e450552b req-79504c55-5796-45e9-aa98-4b04a2808776 service nova] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Received unexpected event network-vif-plugged-552bc11e-f01b-4ee4-94e0-64ab1e19ad9e for instance with vm_state building and task_state spawning. 
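[annotation] The acquire/release of the "<uuid>-events" lock followed by "No waiting events found" and the "Received unexpected event" warning is the prepare/pop pattern for externally delivered events: Neutron reports network-vif-plugged, but no waiter is registered for that port at that moment. A hedged, simplified illustration of the pattern (not Nova's implementation):

    import threading

    class InstanceEvents:
        """Illustrative simplification of the prepare/pop pattern seen in the log."""
        def __init__(self):
            self._events = {}   # (instance_uuid, event_name) -> threading.Event
            self._lock = threading.Lock()

        def prepare_for_event(self, instance_uuid, event_name):
            ev = threading.Event()
            with self._lock:
                self._events[(instance_uuid, event_name)] = ev
            return ev

        def pop_instance_event(self, instance_uuid, event_name):
            with self._lock:    # analogous to the "<uuid>-events" lock acquire/release above
                return self._events.pop((instance_uuid, event_name), None)

    # If network-vif-plugged arrives when nothing is registered (or the waiter already
    # completed), pop returns None -> "No waiting events found dispatching ..." and the
    # manager logs the "Received unexpected event ..." warning instead of failing.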
Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.compute.manager [req-0693fd3c-a9b8-4928-9d01-83c5e450552b req-79504c55-5796-45e9-aa98-4b04a2808776 service nova] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Received event network-vif-plugged-71dae390-2e66-4961-8ce7-1b8fff845732 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-0693fd3c-a9b8-4928-9d01-83c5e450552b req-79504c55-5796-45e9-aa98-4b04a2808776 service nova] Acquiring lock "08d906c5-1698-4c17-8430-c98f10836398-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-0693fd3c-a9b8-4928-9d01-83c5e450552b req-79504c55-5796-45e9-aa98-4b04a2808776 service nova] Lock "08d906c5-1698-4c17-8430-c98f10836398-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-0693fd3c-a9b8-4928-9d01-83c5e450552b req-79504c55-5796-45e9-aa98-4b04a2808776 service nova] Lock "08d906c5-1698-4c17-8430-c98f10836398-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.compute.manager [req-0693fd3c-a9b8-4928-9d01-83c5e450552b req-79504c55-5796-45e9-aa98-4b04a2808776 service nova] [instance: 08d906c5-1698-4c17-8430-c98f10836398] No waiting events found dispatching network-vif-plugged-71dae390-2e66-4961-8ce7-1b8fff845732 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:46:35 user nova-compute[71428]: WARNING nova.compute.manager [req-0693fd3c-a9b8-4928-9d01-83c5e450552b req-79504c55-5796-45e9-aa98-4b04a2808776 service nova] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Received unexpected event network-vif-plugged-71dae390-2e66-4961-8ce7-1b8fff845732 for instance with vm_state building and task_state spawning. 
Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.compute.manager [req-0693fd3c-a9b8-4928-9d01-83c5e450552b req-79504c55-5796-45e9-aa98-4b04a2808776 service nova] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Received event network-vif-plugged-71dae390-2e66-4961-8ce7-1b8fff845732 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-0693fd3c-a9b8-4928-9d01-83c5e450552b req-79504c55-5796-45e9-aa98-4b04a2808776 service nova] Acquiring lock "08d906c5-1698-4c17-8430-c98f10836398-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-0693fd3c-a9b8-4928-9d01-83c5e450552b req-79504c55-5796-45e9-aa98-4b04a2808776 service nova] Lock "08d906c5-1698-4c17-8430-c98f10836398-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-0693fd3c-a9b8-4928-9d01-83c5e450552b req-79504c55-5796-45e9-aa98-4b04a2808776 service nova] Lock "08d906c5-1698-4c17-8430-c98f10836398-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.compute.manager [req-0693fd3c-a9b8-4928-9d01-83c5e450552b req-79504c55-5796-45e9-aa98-4b04a2808776 service nova] [instance: 08d906c5-1698-4c17-8430-c98f10836398] No waiting events found dispatching network-vif-plugged-71dae390-2e66-4961-8ce7-1b8fff845732 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:46:35 user nova-compute[71428]: WARNING nova.compute.manager [req-0693fd3c-a9b8-4928-9d01-83c5e450552b req-79504c55-5796-45e9-aa98-4b04a2808776 service nova] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Received unexpected event network-vif-plugged-71dae390-2e66-4961-8ce7-1b8fff845732 for instance with vm_state building and task_state spawning. Apr 23 03:46:35 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] During sync_power_state the instance has a pending task (spawning). Skip. Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Resumed> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:46:35 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] VM Resumed (Lifecycle Event) Apr 23 03:46:35 user nova-compute[71428]: INFO nova.compute.manager [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Took 19.81 seconds to build instance. 
Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-bfdfc11f-4e95-4357-ba10-63a44016b430 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Lock "f279f3d3-581d-4d6f-924f-4104ec23832a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 19.961s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Started> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:46:35 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] VM Started (Lifecycle Event) Apr 23 03:46:35 user nova-compute[71428]: INFO nova.compute.manager [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Took 11.95 seconds to spawn the instance on the hypervisor. Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.compute.manager [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:46:35 user nova-compute[71428]: INFO nova.compute.manager [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Took 20.77 seconds to build instance. 
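Editor's note: the "Synchronizing instance power state after lifecycle event ..." lines and the earlier "During sync_power_state the instance has a pending task (spawning). Skip." message reflect a simple reconciliation rule. A rough simplification of that decision (not Nova's actual _sync_instance_power_state, which handles many more cases):

RUNNING = 1  # the power-state code shown as "power_state: 1" in the log

def sync_power_state(task_state, db_power_state, vm_power_state):
    # A pending task (e.g. spawning) means the state is still in flux: skip.
    if task_state is not None:
        return "skip: pending task (%s)" % task_state
    # Otherwise reconcile the database against what the hypervisor reports.
    if db_power_state != vm_power_state:
        return "update DB power_state %s -> %s" % (db_power_state, vm_power_state)
    return "in sync"

print(sync_power_state("spawning", 0, RUNNING))  # skipped, as for instance 56d8da41-...
print(sync_power_state(None, RUNNING, RUNNING))  # in sync, as for instance f279f3d3-...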
Apr 23 03:46:35 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-d9e54769-2452-484c-b0e2-f633607bb3bf tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Lock "56d8da41-3e04-465b-a1de-73d9e994682d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 20.993s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Resumed> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:46:35 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 08d906c5-1698-4c17-8430-c98f10836398] VM Resumed (Lifecycle Event) Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:46:35 user nova-compute[71428]: INFO nova.compute.manager [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Took 12.74 seconds to build instance. 
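Editor's note: the recurring "ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25" lines only mean that the OVSDB connection's file descriptor became readable and the IDL event loop woke up to process an update. A tiny, generic illustration of that poll/wakeup idea using Python's standard selectors module; this is unrelated to ovsdbapp's own implementation and uses a local socketpair purely for demonstration:

import selectors
import socket

# Watch a socket for readability, the way the OVSDB IDL poller watches fd 25.
sel = selectors.DefaultSelector()
a, b = socket.socketpair()
sel.register(a, selectors.EVENT_READ)

b.send(b'update')                       # something arrives on the connection
for key, events in sel.select(timeout=1):
    if events & selectors.EVENT_READ:   # this is the "[POLLIN] on fd N" condition
        print('POLLIN on fd', key.fd, key.fileobj.recv(16))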
Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c76bc28a-f9bd-476f-8654-c91343a03b52 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Lock "08d906c5-1698-4c17-8430-c98f10836398" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.866s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Started> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:46:35 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 08d906c5-1698-4c17-8430-c98f10836398] VM Started (Lifecycle Event) Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.compute.manager [req-bf8cc144-36c3-434f-a176-28e53f1b535a req-001dbe34-43d9-459b-96d8-7978ea499f63 service nova] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Received event network-vif-plugged-680802e3-0304-496f-927b-855e4167272b {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-bf8cc144-36c3-434f-a176-28e53f1b535a req-001dbe34-43d9-459b-96d8-7978ea499f63 service nova] Acquiring lock "f279f3d3-581d-4d6f-924f-4104ec23832a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-bf8cc144-36c3-434f-a176-28e53f1b535a req-001dbe34-43d9-459b-96d8-7978ea499f63 service nova] Lock "f279f3d3-581d-4d6f-924f-4104ec23832a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-bf8cc144-36c3-434f-a176-28e53f1b535a req-001dbe34-43d9-459b-96d8-7978ea499f63 service nova] Lock "f279f3d3-581d-4d6f-924f-4104ec23832a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) 
inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:46:35 user nova-compute[71428]: DEBUG nova.compute.manager [req-bf8cc144-36c3-434f-a176-28e53f1b535a req-001dbe34-43d9-459b-96d8-7978ea499f63 service nova] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] No waiting events found dispatching network-vif-plugged-680802e3-0304-496f-927b-855e4167272b {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:46:35 user nova-compute[71428]: WARNING nova.compute.manager [req-bf8cc144-36c3-434f-a176-28e53f1b535a req-001dbe34-43d9-459b-96d8-7978ea499f63 service nova] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Received unexpected event network-vif-plugged-680802e3-0304-496f-927b-855e4167272b for instance with vm_state active and task_state None. Apr 23 03:46:38 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:38 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:38 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:38 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:40 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:40 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:40 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:41 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:41 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:41 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:41 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:41 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:41 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:42 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:43 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 
{{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:45 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:46 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:46 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:47 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:48 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:49 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:50 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:51 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Acquiring lock "ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:46:51 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Lock "ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:46:51 user nova-compute[71428]: DEBUG nova.compute.manager [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Starting instance... 
{{(pid=71428) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 23 03:46:51 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:46:51 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:46:51 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71428) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 23 03:46:51 user nova-compute[71428]: INFO nova.compute.claims [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Claim successful on node user Apr 23 03:46:51 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:46:52 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:46:52 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.502s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:46:52 user nova-compute[71428]: DEBUG nova.compute.manager [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Start building networks asynchronously for 
instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 23 03:46:52 user nova-compute[71428]: DEBUG nova.compute.manager [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Allocating IP information in the background. {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 23 03:46:52 user nova-compute[71428]: DEBUG nova.network.neutron [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] allocate_for_instance() {{(pid=71428) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 23 03:46:52 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Ignoring supplied device name: /dev/sda. Libvirt can't honour user-supplied dev names Apr 23 03:46:52 user nova-compute[71428]: DEBUG nova.compute.manager [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Start building block device mappings for instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 23 03:46:52 user nova-compute[71428]: DEBUG nova.compute.manager [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Start spawning the instance on the hypervisor. 
{{(pid=71428) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 23 03:46:52 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Creating instance directory {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 23 03:46:52 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Creating image(s) Apr 23 03:46:52 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Acquiring lock "/opt/stack/data/nova/instances/ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:46:52 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Lock "/opt/stack/data/nova/instances/ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:46:52 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Lock "/opt/stack/data/nova/instances/ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:46:52 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Acquiring lock "ae63d1a49fce0b03cf1cd9e2c2db64de02c96b52" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:46:52 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Lock "ae63d1a49fce0b03cf1cd9e2c2db64de02c96b52" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:46:52 user nova-compute[71428]: DEBUG nova.policy [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 
'fc51570d0e7c4617ac6b8fa428b9660f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0f253a2e878a45d99dd3cbda86c0c6ff', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71428) authorize /opt/stack/nova/nova/policy.py:203}} Apr 23 03:46:52 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ae63d1a49fce0b03cf1cd9e2c2db64de02c96b52.part --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:46:53 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ae63d1a49fce0b03cf1cd9e2c2db64de02c96b52.part --force-share --output=json" returned: 0 in 0.310s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:46:53 user nova-compute[71428]: DEBUG nova.virt.images [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] 5cb75aca-167f-494a-9dae-c96fc146c1ab was qcow2, converting to raw {{(pid=71428) fetch_to_raw /opt/stack/nova/nova/virt/images.py:165}} Apr 23 03:46:53 user nova-compute[71428]: DEBUG nova.privsep.utils [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71428) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 23 03:46:53 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/ae63d1a49fce0b03cf1cd9e2c2db64de02c96b52.part /opt/stack/data/nova/instances/_base/ae63d1a49fce0b03cf1cd9e2c2db64de02c96b52.converted {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:46:53 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/ae63d1a49fce0b03cf1cd9e2c2db64de02c96b52.part /opt/stack/data/nova/instances/_base/ae63d1a49fce0b03cf1cd9e2c2db64de02c96b52.converted" returned: 0 in 0.131s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:46:53 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None 
req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ae63d1a49fce0b03cf1cd9e2c2db64de02c96b52.converted --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:46:53 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ae63d1a49fce0b03cf1cd9e2c2db64de02c96b52.converted --force-share --output=json" returned: 0 in 0.138s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:46:53 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Lock "ae63d1a49fce0b03cf1cd9e2c2db64de02c96b52" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 1.157s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:46:53 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ae63d1a49fce0b03cf1cd9e2c2db64de02c96b52 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:46:53 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:53 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ae63d1a49fce0b03cf1cd9e2c2db64de02c96b52 --force-share --output=json" returned: 0 in 0.153s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:46:53 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Acquiring lock "ae63d1a49fce0b03cf1cd9e2c2db64de02c96b52" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:46:53 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 
tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Lock "ae63d1a49fce0b03cf1cd9e2c2db64de02c96b52" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:46:53 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ae63d1a49fce0b03cf1cd9e2c2db64de02c96b52 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:46:53 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ae63d1a49fce0b03cf1cd9e2c2db64de02c96b52 --force-share --output=json" returned: 0 in 0.134s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:46:53 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/ae63d1a49fce0b03cf1cd9e2c2db64de02c96b52,backing_fmt=raw /opt/stack/data/nova/instances/ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72/disk 1073741824 {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:46:53 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/ae63d1a49fce0b03cf1cd9e2c2db64de02c96b52,backing_fmt=raw /opt/stack/data/nova/instances/ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72/disk 1073741824" returned: 0 in 0.048s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:46:53 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Lock "ae63d1a49fce0b03cf1cd9e2c2db64de02c96b52" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.190s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:46:53 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/_base/ae63d1a49fce0b03cf1cd9e2c2db64de02c96b52 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:46:53 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ae63d1a49fce0b03cf1cd9e2c2db64de02c96b52 --force-share --output=json" returned: 0 in 0.139s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:46:53 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Checking if we can resize image /opt/stack/data/nova/instances/ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72/disk. size=1073741824 {{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 23 03:46:53 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:46:54 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72/disk --force-share --output=json" returned: 0 in 0.171s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:46:54 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Cannot resize image /opt/stack/data/nova/instances/ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72/disk to a smaller size. 
{{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 23 03:46:54 user nova-compute[71428]: DEBUG nova.objects.instance [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Lazy-loading 'migration_context' on Instance uuid ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:46:54 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Created local disks {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 23 03:46:54 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Ensure instance console log exists: /opt/stack/data/nova/instances/ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72/console.log {{(pid=71428) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 23 03:46:54 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:46:54 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:46:54 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:46:54 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:54 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:55 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:55 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:55 user nova-compute[71428]: DEBUG nova.network.neutron [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 
tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Successfully created port: 3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a {{(pid=71428) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 23 03:46:56 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:57 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:57 user nova-compute[71428]: DEBUG nova.network.neutron [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Successfully updated port: 3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a {{(pid=71428) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 23 03:46:58 user nova-compute[71428]: DEBUG nova.compute.manager [req-777b5ae5-a4d2-4063-8a0b-270d009b815b req-20e01e54-c6b7-42f4-bff3-7bb1c99cd868 service nova] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Received event network-changed-3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:46:58 user nova-compute[71428]: DEBUG nova.compute.manager [req-777b5ae5-a4d2-4063-8a0b-270d009b815b req-20e01e54-c6b7-42f4-bff3-7bb1c99cd868 service nova] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Refreshing instance network info cache due to event network-changed-3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a. {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 23 03:46:58 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-777b5ae5-a4d2-4063-8a0b-270d009b815b req-20e01e54-c6b7-42f4-bff3-7bb1c99cd868 service nova] Acquiring lock "refresh_cache-ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:46:58 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-777b5ae5-a4d2-4063-8a0b-270d009b815b req-20e01e54-c6b7-42f4-bff3-7bb1c99cd868 service nova] Acquired lock "refresh_cache-ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:46:58 user nova-compute[71428]: DEBUG nova.network.neutron [req-777b5ae5-a4d2-4063-8a0b-270d009b815b req-20e01e54-c6b7-42f4-bff3-7bb1c99cd868 service nova] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Refreshing network info cache for port 3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 23 03:46:58 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Acquiring lock "refresh_cache-ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:46:58 user nova-compute[71428]: DEBUG nova.network.neutron [req-777b5ae5-a4d2-4063-8a0b-270d009b815b req-20e01e54-c6b7-42f4-bff3-7bb1c99cd868 service nova] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Instance cache missing network info. 
{{(pid=71428) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 23 03:46:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:46:59 user nova-compute[71428]: DEBUG nova.network.neutron [req-777b5ae5-a4d2-4063-8a0b-270d009b815b req-20e01e54-c6b7-42f4-bff3-7bb1c99cd868 service nova] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Updating instance_info_cache with network_info: [] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:46:59 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-777b5ae5-a4d2-4063-8a0b-270d009b815b req-20e01e54-c6b7-42f4-bff3-7bb1c99cd868 service nova] Releasing lock "refresh_cache-ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:46:59 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Acquired lock "refresh_cache-ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:46:59 user nova-compute[71428]: DEBUG nova.network.neutron [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Building network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 23 03:46:59 user nova-compute[71428]: DEBUG nova.network.neutron [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Instance cache missing network info. 
{{(pid=71428) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 23 03:47:00 user nova-compute[71428]: DEBUG nova.network.neutron [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Updating instance_info_cache with network_info: [{"id": "3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a", "address": "fa:16:3e:68:41:17", "network": {"id": "3986d1d8-252f-42bf-99c9-3d8b53a4b69d", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1773818503-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "0f253a2e878a45d99dd3cbda86c0c6ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ba908e8-68", "ovs_interfaceid": "3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:47:00 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Releasing lock "refresh_cache-ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:47:00 user nova-compute[71428]: DEBUG nova.compute.manager [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Instance network_info: |[{"id": "3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a", "address": "fa:16:3e:68:41:17", "network": {"id": "3986d1d8-252f-42bf-99c9-3d8b53a4b69d", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1773818503-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "0f253a2e878a45d99dd3cbda86c0c6ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ba908e8-68", "ovs_interfaceid": "3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 23 03:47:00 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Start 
_get_guest_xml network_info=[{"id": "3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a", "address": "fa:16:3e:68:41:17", "network": {"id": "3986d1d8-252f-42bf-99c9-3d8b53a4b69d", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1773818503-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "0f253a2e878a45d99dd3cbda86c0c6ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ba908e8-68", "ovs_interfaceid": "3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'scsi', 'cdrom_bus': 'scsi', 'mapping': {'root': {'bus': 'scsi', 'dev': 'sda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'scsi', 'dev': 'sda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'scsi', 'dev': 'sdb', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:46:43Z,direct_url=,disk_format='qcow2',id=5cb75aca-167f-494a-9dae-c96fc146c1ab,min_disk=0,min_ram=0,name='',owner='b23bc5311c8f448099b3a767e96e6bc3',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:46:45Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/sda', 'image': [{'disk_bus': 'scsi', 'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'device_name': '/dev/sda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'image_id': '5cb75aca-167f-494a-9dae-c96fc146c1ab'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 23 03:47:00 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:47:00 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
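Editor's note: the image-backend steps logged above at 03:46:52-03:46:53 follow a fixed sequence: inspect the fetched base image with qemu-img info (Nova runs it under the oslo_concurrency.prlimit wrapper, --as=1073741824 --cpu=30, to bound the helper process), convert it from qcow2 to a raw base file, then create the per-instance disk as a qcow2 overlay whose backing file is that base. A rough shell-out sketch of the same three steps using Python's subprocess; the /tmp paths are illustrative stand-ins for the _base/... and instances/... paths above, it assumes qemu-img is installed, and it is not Nova's imagebackend code:

import json
import subprocess

base_part = '/tmp/base.qcow2'     # stand-in for .../instances/_base/<hash>.part
base_raw = '/tmp/base.raw'        # stand-in for .../instances/_base/<hash>
instance_disk = '/tmp/disk.qcow2' # stand-in for .../instances/<uuid>/disk

# 1. Inspect the fetched image, as in the logged "qemu-img info --force-share --output=json".
info = json.loads(subprocess.check_output(
    ['qemu-img', 'info', '--force-share', '--output=json', base_part]))

# 2. Convert the qcow2 image to a raw base, as in "qemu-img convert -t none -O raw -f qcow2 ...".
if info['format'] == 'qcow2':
    subprocess.check_call(['qemu-img', 'convert', '-t', 'none', '-O', 'raw', '-f', 'qcow2',
                           base_part, base_raw])

# 3. Create the 1 GiB per-instance overlay backed by the raw base, as in
#    "qemu-img create -f qcow2 -o backing_file=...,backing_fmt=raw <disk> 1073741824".
subprocess.check_call(['qemu-img', 'create', '-f', 'qcow2',
                       '-o', 'backing_file=%s,backing_fmt=raw' % base_raw,
                       instance_disk, '1073741824'])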
Apr 23 03:47:00 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71428) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 23 03:47:00 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-23T03:43:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:46:43Z,direct_url=,disk_format='qcow2',id=5cb75aca-167f-494a-9dae-c96fc146c1ab,min_disk=0,min_ram=0,name='',owner='b23bc5311c8f448099b3a767e96e6bc3',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:46:45Z,virtual_size=,visibility=), allow threads: True {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 23 03:47:00 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Flavor limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 23 03:47:00 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Image limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 23 03:47:00 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Flavor pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 23 03:47:00 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Image pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 23 03:47:00 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 23 03:47:00 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71428) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:564}} Apr 23 03:47:00 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 23 03:47:00 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Got 1 possible topologies {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 23 03:47:00 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 23 03:47:00 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 23 03:47:00 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2023-04-23T03:46:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-1223917650',display_name='tempest-AttachSCSIVolumeTestJSON-server-1223917650',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachscsivolumetestjson-server-1223917650',id=4,image_ref='5cb75aca-167f-494a-9dae-c96fc146c1ab',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGNSlNQDdoc5OwX5MqEuWXOwleRbjHOg+KZZGXRrtYPfsqJpkBX2tsZzDLx+qRW0+E03swcsJdX8Ei+ZrkxMI0G08tG/YU2etmuLBhObrxLGe/QP38Mzd/FO6rnQsefmhQ==',key_name='tempest-keypair-282378258',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0f253a2e878a45d99dd3cbda86c0c6ff',ramdisk_id='',reservation_id='r-sr4p0ec0',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5cb75aca-167f-494a-9dae-c96fc146c1ab',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_machine_type='pc',image_hw_scsi_model='virtio-scsi',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachSCSIVolumeTestJSON-1319344340',owner_user_name='tempest-AttachSCSIVolumeTestJSON-1319344340-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:46:52Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fc51570d0e7c4617ac6b8fa428b9660f',uuid=ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a", "address": "fa:16:3e:68:41:17", "network": {"id": "3986d1d8-252f-42bf-99c9-3d8b53a4b69d", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1773818503-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "0f253a2e878a45d99dd3cbda86c0c6ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ba908e8-68", "ovs_interfaceid": "3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71428) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 23 03:47:00 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Converting VIF {"id": "3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a", "address": "fa:16:3e:68:41:17", "network": {"id": "3986d1d8-252f-42bf-99c9-3d8b53a4b69d", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1773818503-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "0f253a2e878a45d99dd3cbda86c0c6ff", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ba908e8-68", "ovs_interfaceid": "3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:47:00 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:68:41:17,bridge_name='br-int',has_traffic_filtering=True,id=3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a,network=Network(3986d1d8-252f-42bf-99c9-3d8b53a4b69d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ba908e8-68') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:47:00 user nova-compute[71428]: DEBUG nova.objects.instance [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Lazy-loading 'pci_devices' on Instance uuid ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:47:00 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] End _get_guest_xml xml= Apr 23 03:47:00 user nova-compute[71428]: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72 Apr 23 03:47:00 user nova-compute[71428]: instance-00000004 Apr 23 03:47:00 user nova-compute[71428]: 131072 Apr 23 03:47:00 user nova-compute[71428]: 1 Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: tempest-AttachSCSIVolumeTestJSON-server-1223917650 Apr 23 03:47:00 user nova-compute[71428]: 2023-04-23 03:47:00 Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: 128 Apr 23 03:47:00 user nova-compute[71428]: 1 Apr 23 03:47:00 user nova-compute[71428]: 0 Apr 23 03:47:00 user nova-compute[71428]: 0 Apr 23 03:47:00 user nova-compute[71428]: 1 Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: tempest-AttachSCSIVolumeTestJSON-1319344340-project-member Apr 23 03:47:00 user nova-compute[71428]: tempest-AttachSCSIVolumeTestJSON-1319344340 Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: OpenStack Foundation Apr 23 03:47:00 user nova-compute[71428]: OpenStack Nova Apr 23 03:47:00 user nova-compute[71428]: 0.0.0 Apr 23 03:47:00 user nova-compute[71428]: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72 Apr 23 03:47:00 user 
nova-compute[71428]: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72 Apr 23 03:47:00 user nova-compute[71428]: Virtual Machine Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: hvm Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Nehalem Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]:
Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]:
Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: /dev/urandom Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: Apr 23 03:47:00 user nova-compute[71428]: {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 23 03:47:00 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2023-04-23T03:46:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-1223917650',display_name='tempest-AttachSCSIVolumeTestJSON-server-1223917650',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachscsivolumetestjson-server-1223917650',id=4,image_ref='5cb75aca-167f-494a-9dae-c96fc146c1ab',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGNSlNQDdoc5OwX5MqEuWXOwleRbjHOg+KZZGXRrtYPfsqJpkBX2tsZzDLx+qRW0+E03swcsJdX8Ei+ZrkxMI0G08tG/YU2etmuLBhObrxLGe/QP38Mzd/FO6rnQsefmhQ==',key_name='tempest-keypair-282378258',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0f253a2e878a45d99dd3cbda86c0c6ff',ramdisk_id='',reservation_id='r-sr4p0ec0',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5cb75aca-167f-494a-9dae-c96fc146c1ab',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_machine_type='pc',image_hw_scsi_model='virtio-scsi',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachSCSIVolumeTestJSON-1319344340',owner_user_name='tempest-AttachSCSIVolumeTestJSON-1319344340-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:46:52Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fc51570d0e7c4617ac6b8fa428b9660f',uuid=ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a", "address": "fa:16:3e:68:41:17", "network": {"id": "3986d1d8-252f-42bf-99c9-3d8b53a4b69d", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1773818503-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "0f253a2e878a45d99dd3cbda86c0c6ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ba908e8-68", "ovs_interfaceid": "3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 23 03:47:00 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Converting VIF {"id": "3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a", "address": "fa:16:3e:68:41:17", "network": {"id": "3986d1d8-252f-42bf-99c9-3d8b53a4b69d", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1773818503-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "0f253a2e878a45d99dd3cbda86c0c6ff", "mtu": 1442, "physical_network": 
null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ba908e8-68", "ovs_interfaceid": "3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:47:00 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:68:41:17,bridge_name='br-int',has_traffic_filtering=True,id=3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a,network=Network(3986d1d8-252f-42bf-99c9-3d8b53a4b69d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ba908e8-68') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:47:00 user nova-compute[71428]: DEBUG os_vif [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:41:17,bridge_name='br-int',has_traffic_filtering=True,id=3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a,network=Network(3986d1d8-252f-42bf-99c9-3d8b53a4b69d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ba908e8-68') {{(pid=71428) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 23 03:47:00 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:00 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:47:00 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 23 03:47:00 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:00 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:00 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3ba908e8-68, may_exist=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:47:00 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3ba908e8-68, col_values=(('external_ids', {'iface-id': '3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:68:41:17', 'vm-uuid': 'ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72'}),)) {{(pid=71428) do_commit 
/usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:47:00 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:00 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:47:00 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:00 user nova-compute[71428]: INFO os_vif [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:41:17,bridge_name='br-int',has_traffic_filtering=True,id=3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a,network=Network(3986d1d8-252f-42bf-99c9-3d8b53a4b69d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ba908e8-68') Apr 23 03:47:00 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] No BDM found with device name sda, not building metadata. {{(pid=71428) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 23 03:47:00 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] No BDM found with device name sdb, not building metadata. 
{{(pid=71428) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 23 03:47:00 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] No VIF found with MAC fa:16:3e:68:41:17, not building metadata {{(pid=71428) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 23 03:47:00 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Using config drive Apr 23 03:47:00 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Creating config drive at /opt/stack/data/nova/instances/ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72/disk.config Apr 23 03:47:00 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Running cmd (subprocess): genisoimage -o /opt/stack/data/nova/instances/ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 0.0.0 -quiet -J -r -V config-2 /tmp/tmpli5702nl {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:47:00 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] CMD "genisoimage -o /opt/stack/data/nova/instances/ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 0.0.0 -quiet -J -r -V config-2 /tmp/tmpli5702nl" returned: 0 in 0.076s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:47:01 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:02 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:02 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:02 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:02 user nova-compute[71428]: DEBUG nova.compute.manager [req-b3633586-da16-4daf-9ec4-8837c3ce200a req-33d29f98-f691-427f-8b83-7d97c5668830 service nova] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Received event network-vif-plugged-3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:47:02 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-b3633586-da16-4daf-9ec4-8837c3ce200a req-33d29f98-f691-427f-8b83-7d97c5668830 
service nova] Acquiring lock "ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:47:02 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-b3633586-da16-4daf-9ec4-8837c3ce200a req-33d29f98-f691-427f-8b83-7d97c5668830 service nova] Lock "ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:47:02 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-b3633586-da16-4daf-9ec4-8837c3ce200a req-33d29f98-f691-427f-8b83-7d97c5668830 service nova] Lock "ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:47:02 user nova-compute[71428]: DEBUG nova.compute.manager [req-b3633586-da16-4daf-9ec4-8837c3ce200a req-33d29f98-f691-427f-8b83-7d97c5668830 service nova] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] No waiting events found dispatching network-vif-plugged-3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:47:02 user nova-compute[71428]: WARNING nova.compute.manager [req-b3633586-da16-4daf-9ec4-8837c3ce200a req-33d29f98-f691-427f-8b83-7d97c5668830 service nova] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Received unexpected event network-vif-plugged-3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a for instance with vm_state building and task_state spawning. 
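For reference, the AddPortCommand and DbSetCommand transaction that os-vif committed above (port tap3ba908e8-68 on br-int, plus its external_ids) is roughly equivalent to the ovs-vsctl call sketched below. This is an illustrative approximation driven from Python, not the actual os-vif/ovsdbapp code path, and it assumes ovs-vsctl is available on the host.

    # Illustrative approximation of the OVSDB transaction logged above:
    # add the tap port to br-int and set its external_ids in one call.
    import subprocess

    port = "tap3ba908e8-68"
    external_ids = {
        "iface-id": "3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a",
        "iface-status": "active",
        "attached-mac": "fa:16:3e:68:41:17",
        "vm-uuid": "ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72",
    }
    cmd = ["ovs-vsctl", "--may-exist", "add-port", "br-int", port,
           "--", "set", "Interface", port]
    cmd += [f"external_ids:{key}={value}" for key, value in external_ids.items()]
    subprocess.run(cmd, check=True)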
Apr 23 03:47:02 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:02 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:03 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:03 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:03 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:47:03 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Cleaning up deleted instances {{(pid=71428) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 23 03:47:03 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] There are 0 instances to clean {{(pid=71428) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 23 03:47:03 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:47:03 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Cleaning up deleted instances with incomplete migration {{(pid=71428) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 23 03:47:03 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:47:04 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Resumed> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:47:04 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] VM Resumed (Lifecycle Event) Apr 23 03:47:04 user nova-compute[71428]: DEBUG nova.compute.manager [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Instance event wait completed in 0 seconds for {{(pid=71428) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 23 03:47:04 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Guest created on hypervisor 
{{(pid=71428) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 23 03:47:04 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Instance spawned successfully. Apr 23 03:47:04 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Attempting to register defaults for the following image properties: ['hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 23 03:47:04 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Found default for hw_input_bus of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:47:04 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Found default for hw_pointer_model of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:47:04 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Found default for hw_video_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:47:04 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Found default for hw_vif_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:47:04 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:47:04 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 03:47:04 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] During sync_power_state the instance has a pending task (spawning). Skip. 
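The hw_input_bus, hw_pointer_model, hw_video_model and hw_vif_model defaults registered above are ordinary image properties. As an illustration only (the image ID is the one used for this boot, the property values are the virtio defaults the driver just reported, and an installed, authenticated OpenStack client is assumed), they could be pinned explicitly on the image like this:

    # Illustrative: pin the virtio defaults reported above as explicit
    # image properties, using the standard OpenStack client.
    import subprocess

    subprocess.run(
        ["openstack", "image", "set",
         "--property", "hw_video_model=virtio",
         "--property", "hw_vif_model=virtio",
         "5cb75aca-167f-494a-9dae-c96fc146c1ab"],
        check=True,
    )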
Apr 23 03:47:04 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Started> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:47:04 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] VM Started (Lifecycle Event) Apr 23 03:47:04 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:47:04 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 03:47:04 user nova-compute[71428]: INFO nova.compute.manager [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Took 12.37 seconds to spawn the instance on the hypervisor. Apr 23 03:47:04 user nova-compute[71428]: DEBUG nova.compute.manager [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:47:04 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] During sync_power_state the instance has a pending task (spawning). Skip. Apr 23 03:47:04 user nova-compute[71428]: INFO nova.compute.manager [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Took 13.26 seconds to build instance. 
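The "Checking state" records above amount to a libvirt domain state lookup that is then compared with the power_state stored in the database. A minimal stand-alone sketch of such a lookup with the libvirt Python bindings (the connection URI is an assumption; the UUID is the instance from this log) might look like this:

    # Minimal sketch of the power-state check behind the "Checking state"
    # records; qemu:///system is an assumed connection URI.
    import libvirt

    conn = libvirt.open("qemu:///system")
    try:
        dom = conn.lookupByUUIDString("ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72")
        state, _reason = dom.state()
        print("running" if state == libvirt.VIR_DOMAIN_RUNNING else f"state={state}")
    finally:
        conn.close()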
Apr 23 03:47:04 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-6f61caac-a599-4d6f-8541-77801b995dda tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Lock "ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 13.403s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:47:04 user nova-compute[71428]: DEBUG nova.compute.manager [req-73dd1c62-c228-4ceb-aa11-67fa4975fe4e req-21f6a07a-7bdc-4859-a112-d233be5df790 service nova] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Received event network-vif-plugged-3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:47:04 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-73dd1c62-c228-4ceb-aa11-67fa4975fe4e req-21f6a07a-7bdc-4859-a112-d233be5df790 service nova] Acquiring lock "ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:47:04 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-73dd1c62-c228-4ceb-aa11-67fa4975fe4e req-21f6a07a-7bdc-4859-a112-d233be5df790 service nova] Lock "ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:47:04 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-73dd1c62-c228-4ceb-aa11-67fa4975fe4e req-21f6a07a-7bdc-4859-a112-d233be5df790 service nova] Lock "ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:47:04 user nova-compute[71428]: DEBUG nova.compute.manager [req-73dd1c62-c228-4ceb-aa11-67fa4975fe4e req-21f6a07a-7bdc-4859-a112-d233be5df790 service nova] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] No waiting events found dispatching network-vif-plugged-3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:47:04 user nova-compute[71428]: WARNING nova.compute.manager [req-73dd1c62-c228-4ceb-aa11-67fa4975fe4e req-21f6a07a-7bdc-4859-a112-d233be5df790 service nova] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Received unexpected event network-vif-plugged-3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a for instance with vm_state active and task_state None. 
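The config drive built with genisoimage during the spawn above is a plain ISO 9660 image. A rough reproduction of that build command, mirroring the flags from the log but with an illustrative (empty) source directory and output path, could be driven from Python like this:

    # Rough reproduction of the config-drive build command logged during
    # spawn; the source directory and output path are illustrative only.
    import subprocess
    import tempfile

    with tempfile.TemporaryDirectory() as srcdir:
        # Nova stages the openstack/ and ec2/ metadata trees here first.
        subprocess.run(
            ["genisoimage", "-o", "/tmp/disk.config",
             "-ldots", "-allow-lowercase", "-allow-multidot", "-l",
             "-publisher", "OpenStack Nova 0.0.0",
             "-quiet", "-J", "-r", "-V", "config-2", srcdir],
            check=True,
        )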
Apr 23 03:47:05 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:05 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:05 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:47:05 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:47:06 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:47:06 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:07 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:07 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:07 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:47:07 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:47:07 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71428) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 23 03:47:08 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:08 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:47:08 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:47:08 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:47:08 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:47:08 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:47:08 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Auditing locally available compute resources for user (node: user) {{(pid=71428) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 23 03:47:08 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f279f3d3-581d-4d6f-924f-4104ec23832a/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:47:09 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f279f3d3-581d-4d6f-924f-4104ec23832a/disk --force-share --output=json" returned: 0 in 0.177s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:47:09 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f279f3d3-581d-4d6f-924f-4104ec23832a/disk --force-share --output=json 
{{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:47:09 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f279f3d3-581d-4d6f-924f-4104ec23832a/disk --force-share --output=json" returned: 0 in 0.155s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:47:09 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/56d8da41-3e04-465b-a1de-73d9e994682d/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:47:09 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/56d8da41-3e04-465b-a1de-73d9e994682d/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:47:09 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/56d8da41-3e04-465b-a1de-73d9e994682d/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:47:09 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/56d8da41-3e04-465b-a1de-73d9e994682d/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:47:09 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/08d906c5-1698-4c17-8430-c98f10836398/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:47:09 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/08d906c5-1698-4c17-8430-c98f10836398/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:47:09 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 
None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/08d906c5-1698-4c17-8430-c98f10836398/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:47:09 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/08d906c5-1698-4c17-8430-c98f10836398/disk --force-share --output=json" returned: 0 in 0.261s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:47:09 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:47:09 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:10 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72/disk --force-share --output=json" returned: 0 in 0.158s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:47:10 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:47:10 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72/disk --force-share --output=json" returned: 0 in 0.159s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:47:10 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:10 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:47:10 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 23 03:47:10 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Hypervisor/Node resource view: name=user free_ram=8580MB free_disk=26.371341705322266GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": 
"label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": 
"0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71428) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 23 03:47:10 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:47:10 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:47:11 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 56d8da41-3e04-465b-a1de-73d9e994682d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:47:11 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance f279f3d3-581d-4d6f-924f-4104ec23832a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:47:11 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 08d906c5-1698-4c17-8430-c98f10836398 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:47:11 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:47:11 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Total usable vcpus: 12, total allocated vcpus: 4 {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 23 03:47:11 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Final resource view: name=user phys_ram=16023MB used_ram=1024MB phys_disk=40GB used_disk=4GB total_vcpus=12 used_vcpus=4 pci_stats=[] {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 23 03:47:11 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:47:11 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:47:11 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Compute_service record updated for user:user {{(pid=71428) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 23 03:47:11 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.659s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:47:12 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:12 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:47:12 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Starting heal instance info cache {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 23 03:47:12 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Rebuilding the list of instances to heal {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 23 03:47:12 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock 
"refresh_cache-56d8da41-3e04-465b-a1de-73d9e994682d" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:47:12 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquired lock "refresh_cache-56d8da41-3e04-465b-a1de-73d9e994682d" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:47:12 user nova-compute[71428]: DEBUG nova.network.neutron [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Forcefully refreshing network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 23 03:47:12 user nova-compute[71428]: DEBUG nova.objects.instance [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lazy-loading 'info_cache' on Instance uuid 56d8da41-3e04-465b-a1de-73d9e994682d {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:47:12 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:13 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:13 user nova-compute[71428]: DEBUG nova.network.neutron [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Updating instance_info_cache with network_info: [{"id": "552bc11e-f01b-4ee4-94e0-64ab1e19ad9e", "address": "fa:16:3e:c8:1e:d4", "network": {"id": "47db66e9-5162-4031-9d71-5e13c16ee002", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1314718942-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c4f91ccdb4da4cb4bc4b55b7ac2189f0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap552bc11e-f0", "ovs_interfaceid": "552bc11e-f01b-4ee4-94e0-64ab1e19ad9e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:47:13 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Releasing lock "refresh_cache-56d8da41-3e04-465b-a1de-73d9e994682d" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:47:13 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Updated the network info_cache for instance {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 23 03:47:13 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71428) 
run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:47:14 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:15 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:15 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Acquiring lock "954f41b2-5c67-41a4-b38b-ebf3ad60cac7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:47:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Lock "954f41b2-5c67-41a4-b38b-ebf3ad60cac7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:47:17 user nova-compute[71428]: DEBUG nova.compute.manager [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Starting instance... {{(pid=71428) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 23 03:47:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:47:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:47:17 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71428) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 23 03:47:17 user nova-compute[71428]: INFO nova.compute.claims [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Claim successful on node user Apr 23 03:47:17 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Refreshing inventories for resource provider 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 23 03:47:17 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Updating ProviderTree inventory for provider 3017e09c-9289-4a8e-8061-3ff90149e985 from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 23 03:47:17 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Updating inventory in ProviderTree for provider 3017e09c-9289-4a8e-8061-3ff90149e985 with inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 23 03:47:17 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Refreshing aggregate associations for resource provider 3017e09c-9289-4a8e-8061-3ff90149e985, aggregates: None {{(pid=71428) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 23 03:47:17 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Refreshing trait associations for resource provider 3017e09c-9289-4a8e-8061-3ff90149e985, traits: 
COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE41,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_RESCUE_BFV,COMPUTE_NODE,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_GRAPHICS_MODEL_CIRRUS {{(pid=71428) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 23 03:47:17 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:47:17 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:47:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.512s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:47:17 user nova-compute[71428]: DEBUG nova.compute.manager [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Start building networks asynchronously for instance. 
{{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 23 03:47:17 user nova-compute[71428]: DEBUG nova.compute.manager [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Allocating IP information in the background. {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 23 03:47:17 user nova-compute[71428]: DEBUG nova.network.neutron [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] allocate_for_instance() {{(pid=71428) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 23 03:47:17 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 23 03:47:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Acquiring lock "3ec36a95-88ce-4ed1-9726-7e6d98674dec" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:47:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "3ec36a95-88ce-4ed1-9726-7e6d98674dec" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:47:17 user nova-compute[71428]: DEBUG nova.compute.manager [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Starting instance... {{(pid=71428) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 23 03:47:17 user nova-compute[71428]: DEBUG nova.compute.manager [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Start building block device mappings for instance. 
{{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 23 03:47:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:47:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:47:17 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71428) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 23 03:47:17 user nova-compute[71428]: INFO nova.compute.claims [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Claim successful on node user Apr 23 03:47:18 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG nova.policy [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cdc8fe94058c46c28d4b3f16dc1e77ed', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6f44fe9ba07e4e2f925d1a8952356e04', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71428) authorize /opt/stack/nova/nova/policy.py:203}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG nova.compute.manager [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Start spawning the instance on the hypervisor. 
{{(pid=71428) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Creating instance directory {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 23 03:47:18 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Creating image(s) Apr 23 03:47:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Acquiring lock "/opt/stack/data/nova/instances/954f41b2-5c67-41a4-b38b-ebf3ad60cac7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Lock "/opt/stack/data/nova/instances/954f41b2-5c67-41a4-b38b-ebf3ad60cac7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Lock "/opt/stack/data/nova/instances/954f41b2-5c67-41a4-b38b-ebf3ad60cac7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.003s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.150s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f 
tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Acquiring lock "935dc3474dddbde110531a31510941caddc0ae83" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.145s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/954f41b2-5c67-41a4-b38b-ebf3ad60cac7/disk 1073741824 {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/954f41b2-5c67-41a4-b38b-ebf3ad60cac7/disk 1073741824" returned: 0 in 0.053s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG 
oslo_concurrency.lockutils [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.203s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.500s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG nova.compute.manager [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Start building networks asynchronously for instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.147s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Checking if we can resize image /opt/stack/data/nova/instances/954f41b2-5c67-41a4-b38b-ebf3ad60cac7/disk. 
size=1073741824 {{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/954f41b2-5c67-41a4-b38b-ebf3ad60cac7/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG nova.compute.manager [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Allocating IP information in the background. {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG nova.network.neutron [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] allocate_for_instance() {{(pid=71428) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:18 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 23 03:47:18 user nova-compute[71428]: DEBUG nova.compute.manager [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Start building block device mappings for instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/954f41b2-5c67-41a4-b38b-ebf3ad60cac7/disk --force-share --output=json" returned: 0 in 0.170s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Cannot resize image /opt/stack/data/nova/instances/954f41b2-5c67-41a4-b38b-ebf3ad60cac7/disk to a smaller size. 
{{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG nova.objects.instance [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Lazy-loading 'migration_context' on Instance uuid 954f41b2-5c67-41a4-b38b-ebf3ad60cac7 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Created local disks {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Ensure instance console log exists: /opt/stack/data/nova/instances/954f41b2-5c67-41a4-b38b-ebf3ad60cac7/console.log {{(pid=71428) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG nova.network.neutron [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Successfully created port: 97d945a1-b86e-4a6d-af52-23084b8eb175 {{(pid=71428) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG nova.policy [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0a2c459cad014b07b2613e5e261d88aa', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '24fff486a500421397ecb935828582cd', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 
'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71428) authorize /opt/stack/nova/nova/policy.py:203}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG nova.compute.manager [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Start spawning the instance on the hypervisor. {{(pid=71428) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Creating instance directory {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 23 03:47:18 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Creating image(s) Apr 23 03:47:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Acquiring lock "/opt/stack/data/nova/instances/3ec36a95-88ce-4ed1-9726-7e6d98674dec/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "/opt/stack/data/nova/instances/3ec36a95-88ce-4ed1-9726-7e6d98674dec/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "/opt/stack/data/nova/instances/3ec36a95-88ce-4ed1-9726-7e6d98674dec/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 
tempest-AttachVolumeNegativeTest-636753786-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.149s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Acquiring lock "935dc3474dddbde110531a31510941caddc0ae83" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:47:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:47:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.158s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:47:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/3ec36a95-88ce-4ed1-9726-7e6d98674dec/disk 1073741824 {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:47:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/3ec36a95-88ce-4ed1-9726-7e6d98674dec/disk 1073741824" returned: 0 in 0.057s 
{{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:47:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.223s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:47:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:47:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.152s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:47:19 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Checking if we can resize image /opt/stack/data/nova/instances/3ec36a95-88ce-4ed1-9726-7e6d98674dec/disk. 
size=1073741824 {{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 23 03:47:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3ec36a95-88ce-4ed1-9726-7e6d98674dec/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:47:19 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3ec36a95-88ce-4ed1-9726-7e6d98674dec/disk --force-share --output=json" returned: 0 in 0.146s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:47:19 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Cannot resize image /opt/stack/data/nova/instances/3ec36a95-88ce-4ed1-9726-7e6d98674dec/disk to a smaller size. {{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 23 03:47:19 user nova-compute[71428]: DEBUG nova.objects.instance [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lazy-loading 'migration_context' on Instance uuid 3ec36a95-88ce-4ed1-9726-7e6d98674dec {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:47:19 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Created local disks {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 23 03:47:19 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Ensure instance console log exists: /opt/stack/data/nova/instances/3ec36a95-88ce-4ed1-9726-7e6d98674dec/console.log {{(pid=71428) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 23 03:47:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:47:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None 
req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:47:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:47:20 user nova-compute[71428]: DEBUG nova.network.neutron [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Successfully created port: af57258e-69b3-405b-a9f6-8d54c1810960 {{(pid=71428) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 23 03:47:20 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:20 user nova-compute[71428]: DEBUG nova.network.neutron [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Successfully updated port: 97d945a1-b86e-4a6d-af52-23084b8eb175 {{(pid=71428) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 23 03:47:20 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Acquiring lock "refresh_cache-954f41b2-5c67-41a4-b38b-ebf3ad60cac7" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:47:20 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Acquired lock "refresh_cache-954f41b2-5c67-41a4-b38b-ebf3ad60cac7" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:47:20 user nova-compute[71428]: DEBUG nova.network.neutron [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Building network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 23 03:47:20 user nova-compute[71428]: DEBUG nova.compute.manager [req-d1a29d9d-2491-461f-911a-3dbf1de6fa09 req-9707e6c7-c385-41c6-8093-fa4d893a3a76 service nova] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Received event network-changed-97d945a1-b86e-4a6d-af52-23084b8eb175 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:47:20 user nova-compute[71428]: DEBUG nova.compute.manager [req-d1a29d9d-2491-461f-911a-3dbf1de6fa09 req-9707e6c7-c385-41c6-8093-fa4d893a3a76 service nova] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Refreshing instance network info cache due 
to event network-changed-97d945a1-b86e-4a6d-af52-23084b8eb175. {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 23 03:47:20 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-d1a29d9d-2491-461f-911a-3dbf1de6fa09 req-9707e6c7-c385-41c6-8093-fa4d893a3a76 service nova] Acquiring lock "refresh_cache-954f41b2-5c67-41a4-b38b-ebf3ad60cac7" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:47:20 user nova-compute[71428]: DEBUG nova.network.neutron [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Instance cache missing network info. {{(pid=71428) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 23 03:47:20 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG nova.network.neutron [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Updating instance_info_cache with network_info: [{"id": "97d945a1-b86e-4a6d-af52-23084b8eb175", "address": "fa:16:3e:ee:63:58", "network": {"id": "bc639aed-946f-458c-b08d-ce3037f5507c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-558421264-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6f44fe9ba07e4e2f925d1a8952356e04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap97d945a1-b8", "ovs_interfaceid": "97d945a1-b86e-4a6d-af52-23084b8eb175", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Releasing lock "refresh_cache-954f41b2-5c67-41a4-b38b-ebf3ad60cac7" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG nova.compute.manager [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Instance network_info: |[{"id": "97d945a1-b86e-4a6d-af52-23084b8eb175", "address": "fa:16:3e:ee:63:58", "network": {"id": "bc639aed-946f-458c-b08d-ce3037f5507c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-558421264-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6f44fe9ba07e4e2f925d1a8952356e04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap97d945a1-b8", "ovs_interfaceid": "97d945a1-b86e-4a6d-af52-23084b8eb175", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-d1a29d9d-2491-461f-911a-3dbf1de6fa09 req-9707e6c7-c385-41c6-8093-fa4d893a3a76 service nova] Acquired lock "refresh_cache-954f41b2-5c67-41a4-b38b-ebf3ad60cac7" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG nova.network.neutron [req-d1a29d9d-2491-461f-911a-3dbf1de6fa09 req-9707e6c7-c385-41c6-8093-fa4d893a3a76 service nova] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Refreshing network info cache for port 97d945a1-b86e-4a6d-af52-23084b8eb175 {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Start _get_guest_xml network_info=[{"id": "97d945a1-b86e-4a6d-af52-23084b8eb175", "address": "fa:16:3e:ee:63:58", "network": {"id": "bc639aed-946f-458c-b08d-ce3037f5507c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-558421264-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6f44fe9ba07e4e2f925d1a8952356e04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap97d945a1-b8", "ovs_interfaceid": "97d945a1-b86e-4a6d-af52-23084b8eb175", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 
'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'image_id': 'e6127373-9931-4277-9458-eceef653ea1e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 23 03:47:21 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:47:21 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:47:21 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71428) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-23T03:43:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=), allow threads: True {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Flavor limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Image limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Flavor pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Image 
pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Got 1 possible topologies {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:47:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1796045730',display_name='tempest-ServerActionsTestJSON-server-1796045730',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serveractionstestjson-server-1796045730',id=5,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIkvMNG9bJWjIet22ePA1ljN9zECExBTHpeziEsLPMaW1zdmYmHvghy1cyRU7fOajJrIJuDX5HLH+Bcl14bDAwDDg/IMdmy4fis8XPrCxEDMbEbjK+Rs+0/rwp2gYOoLwQ==',key_name='tempest-keypair-1222563018',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6f44fe9ba07e4e2f925d1a8952356e04',ramdisk_id='',reservation_id='r-7q0n4thm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1398779434',owner_user_name='tempest-ServerActionsTestJSON-1398779434-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:47:18Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cdc8fe94058c46c28d4b3f16dc1e77ed',uuid=954f41b2-5c67-41a4-b38b-ebf3ad60cac7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "97d945a1-b86e-4a6d-af52-23084b8eb175", "address": "fa:16:3e:ee:63:58", "network": {"id": "bc639aed-946f-458c-b08d-ce3037f5507c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-558421264-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6f44fe9ba07e4e2f925d1a8952356e04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap97d945a1-b8", "ovs_interfaceid": "97d945a1-b86e-4a6d-af52-23084b8eb175", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71428) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Converting VIF {"id": "97d945a1-b86e-4a6d-af52-23084b8eb175", "address": "fa:16:3e:ee:63:58", "network": {"id": "bc639aed-946f-458c-b08d-ce3037f5507c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-558421264-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, 
"tenant_id": "6f44fe9ba07e4e2f925d1a8952356e04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap97d945a1-b8", "ovs_interfaceid": "97d945a1-b86e-4a6d-af52-23084b8eb175", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:63:58,bridge_name='br-int',has_traffic_filtering=True,id=97d945a1-b86e-4a6d-af52-23084b8eb175,network=Network(bc639aed-946f-458c-b08d-ce3037f5507c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap97d945a1-b8') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG nova.objects.instance [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Lazy-loading 'pci_devices' on Instance uuid 954f41b2-5c67-41a4-b38b-ebf3ad60cac7 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] End _get_guest_xml xml= Apr 23 03:47:21 user nova-compute[71428]: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7 Apr 23 03:47:21 user nova-compute[71428]: instance-00000005 Apr 23 03:47:21 user nova-compute[71428]: 131072 Apr 23 03:47:21 user nova-compute[71428]: 1 Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: tempest-ServerActionsTestJSON-server-1796045730 Apr 23 03:47:21 user nova-compute[71428]: 2023-04-23 03:47:21 Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: 128 Apr 23 03:47:21 user nova-compute[71428]: 1 Apr 23 03:47:21 user nova-compute[71428]: 0 Apr 23 03:47:21 user nova-compute[71428]: 0 Apr 23 03:47:21 user nova-compute[71428]: 1 Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: tempest-ServerActionsTestJSON-1398779434-project-member Apr 23 03:47:21 user nova-compute[71428]: tempest-ServerActionsTestJSON-1398779434 Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: OpenStack Foundation Apr 23 03:47:21 user nova-compute[71428]: OpenStack Nova Apr 23 03:47:21 user nova-compute[71428]: 0.0.0 Apr 23 03:47:21 user nova-compute[71428]: 
954f41b2-5c67-41a4-b38b-ebf3ad60cac7 Apr 23 03:47:21 user nova-compute[71428]: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7 Apr 23 03:47:21 user nova-compute[71428]: Virtual Machine Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: hvm Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Nehalem Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: /dev/urandom Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: Apr 23 03:47:21 user nova-compute[71428]: {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:47:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1796045730',display_name='tempest-ServerActionsTestJSON-server-1796045730',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serveractionstestjson-server-1796045730',id=5,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIkvMNG9bJWjIet22ePA1ljN9zECExBTHpeziEsLPMaW1zdmYmHvghy1cyRU7fOajJrIJuDX5HLH+Bcl14bDAwDDg/IMdmy4fis8XPrCxEDMbEbjK+Rs+0/rwp2gYOoLwQ==',key_name='tempest-keypair-1222563018',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6f44fe9ba07e4e2f925d1a8952356e04',ramdisk_id='',reservation_id='r-7q0n4thm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1398779434',owner_user_name='tempest-ServerActionsTestJSON-1398779434-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:47:18Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cdc8fe94058c46c28d4b3f16dc1e77ed',uuid=954f41b2-5c67-41a4-b38b-ebf3ad60cac7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "97d945a1-b86e-4a6d-af52-23084b8eb175", "address": "fa:16:3e:ee:63:58", "network": {"id": "bc639aed-946f-458c-b08d-ce3037f5507c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-558421264-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6f44fe9ba07e4e2f925d1a8952356e04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap97d945a1-b8", "ovs_interfaceid": "97d945a1-b86e-4a6d-af52-23084b8eb175", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Converting VIF {"id": "97d945a1-b86e-4a6d-af52-23084b8eb175", "address": "fa:16:3e:ee:63:58", "network": {"id": "bc639aed-946f-458c-b08d-ce3037f5507c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-558421264-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, 
"tenant_id": "6f44fe9ba07e4e2f925d1a8952356e04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap97d945a1-b8", "ovs_interfaceid": "97d945a1-b86e-4a6d-af52-23084b8eb175", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:63:58,bridge_name='br-int',has_traffic_filtering=True,id=97d945a1-b86e-4a6d-af52-23084b8eb175,network=Network(bc639aed-946f-458c-b08d-ce3037f5507c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap97d945a1-b8') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG os_vif [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:63:58,bridge_name='br-int',has_traffic_filtering=True,id=97d945a1-b86e-4a6d-af52-23084b8eb175,network=Network(bc639aed-946f-458c-b08d-ce3037f5507c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap97d945a1-b8') {{(pid=71428) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap97d945a1-b8, may_exist=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap97d945a1-b8, col_values=(('external_ids', {'iface-id': '97d945a1-b86e-4a6d-af52-23084b8eb175', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ee:63:58', 'vm-uuid': '954f41b2-5c67-41a4-b38b-ebf3ad60cac7'}),)) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog 
[-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:21 user nova-compute[71428]: INFO os_vif [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:63:58,bridge_name='br-int',has_traffic_filtering=True,id=97d945a1-b86e-4a6d-af52-23084b8eb175,network=Network(bc639aed-946f-458c-b08d-ce3037f5507c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap97d945a1-b8') Apr 23 03:47:21 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] No BDM found with device name vda, not building metadata. {{(pid=71428) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] No VIF found with MAC fa:16:3e:ee:63:58, not building metadata {{(pid=71428) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG nova.network.neutron [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Successfully updated port: af57258e-69b3-405b-a9f6-8d54c1810960 {{(pid=71428) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG nova.compute.manager [req-2ca62dcb-11d8-4b93-bd74-0199d5d6812c req-23a7dd54-cf74-4a66-967d-5f7b206078fb service nova] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Received event network-changed-af57258e-69b3-405b-a9f6-8d54c1810960 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG nova.compute.manager [req-2ca62dcb-11d8-4b93-bd74-0199d5d6812c req-23a7dd54-cf74-4a66-967d-5f7b206078fb service nova] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Refreshing instance network info cache due to event network-changed-af57258e-69b3-405b-a9f6-8d54c1810960. 
{{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-2ca62dcb-11d8-4b93-bd74-0199d5d6812c req-23a7dd54-cf74-4a66-967d-5f7b206078fb service nova] Acquiring lock "refresh_cache-3ec36a95-88ce-4ed1-9726-7e6d98674dec" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-2ca62dcb-11d8-4b93-bd74-0199d5d6812c req-23a7dd54-cf74-4a66-967d-5f7b206078fb service nova] Acquired lock "refresh_cache-3ec36a95-88ce-4ed1-9726-7e6d98674dec" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG nova.network.neutron [req-2ca62dcb-11d8-4b93-bd74-0199d5d6812c req-23a7dd54-cf74-4a66-967d-5f7b206078fb service nova] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Refreshing network info cache for port af57258e-69b3-405b-a9f6-8d54c1810960 {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Acquiring lock "refresh_cache-3ec36a95-88ce-4ed1-9726-7e6d98674dec" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG nova.network.neutron [req-2ca62dcb-11d8-4b93-bd74-0199d5d6812c req-23a7dd54-cf74-4a66-967d-5f7b206078fb service nova] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Instance cache missing network info. {{(pid=71428) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG nova.network.neutron [req-d1a29d9d-2491-461f-911a-3dbf1de6fa09 req-9707e6c7-c385-41c6-8093-fa4d893a3a76 service nova] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Updated VIF entry in instance network info cache for port 97d945a1-b86e-4a6d-af52-23084b8eb175. 
{{(pid=71428) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG nova.network.neutron [req-d1a29d9d-2491-461f-911a-3dbf1de6fa09 req-9707e6c7-c385-41c6-8093-fa4d893a3a76 service nova] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Updating instance_info_cache with network_info: [{"id": "97d945a1-b86e-4a6d-af52-23084b8eb175", "address": "fa:16:3e:ee:63:58", "network": {"id": "bc639aed-946f-458c-b08d-ce3037f5507c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-558421264-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6f44fe9ba07e4e2f925d1a8952356e04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap97d945a1-b8", "ovs_interfaceid": "97d945a1-b86e-4a6d-af52-23084b8eb175", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-d1a29d9d-2491-461f-911a-3dbf1de6fa09 req-9707e6c7-c385-41c6-8093-fa4d893a3a76 service nova] Releasing lock "refresh_cache-954f41b2-5c67-41a4-b38b-ebf3ad60cac7" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG nova.network.neutron [req-2ca62dcb-11d8-4b93-bd74-0199d5d6812c req-23a7dd54-cf74-4a66-967d-5f7b206078fb service nova] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Updating instance_info_cache with network_info: [] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-2ca62dcb-11d8-4b93-bd74-0199d5d6812c req-23a7dd54-cf74-4a66-967d-5f7b206078fb service nova] Releasing lock "refresh_cache-3ec36a95-88ce-4ed1-9726-7e6d98674dec" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Acquired lock "refresh_cache-3ec36a95-88ce-4ed1-9726-7e6d98674dec" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG nova.network.neutron [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Building network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 23 03:47:21 user nova-compute[71428]: DEBUG nova.network.neutron [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Instance cache missing network info. 
{{(pid=71428) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 23 03:47:22 user nova-compute[71428]: DEBUG nova.network.neutron [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Updating instance_info_cache with network_info: [{"id": "af57258e-69b3-405b-a9f6-8d54c1810960", "address": "fa:16:3e:b4:62:dc", "network": {"id": "6adb189c-9b46-4da4-a34b-31edf8685498", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-759967402-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "24fff486a500421397ecb935828582cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf57258e-69", "ovs_interfaceid": "af57258e-69b3-405b-a9f6-8d54c1810960", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:47:22 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Releasing lock "refresh_cache-3ec36a95-88ce-4ed1-9726-7e6d98674dec" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:47:22 user nova-compute[71428]: DEBUG nova.compute.manager [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Instance network_info: |[{"id": "af57258e-69b3-405b-a9f6-8d54c1810960", "address": "fa:16:3e:b4:62:dc", "network": {"id": "6adb189c-9b46-4da4-a34b-31edf8685498", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-759967402-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "24fff486a500421397ecb935828582cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf57258e-69", "ovs_interfaceid": "af57258e-69b3-405b-a9f6-8d54c1810960", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 23 03:47:22 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Start _get_guest_xml 
network_info=[{"id": "af57258e-69b3-405b-a9f6-8d54c1810960", "address": "fa:16:3e:b4:62:dc", "network": {"id": "6adb189c-9b46-4da4-a34b-31edf8685498", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-759967402-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "24fff486a500421397ecb935828582cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf57258e-69", "ovs_interfaceid": "af57258e-69b3-405b-a9f6-8d54c1810960", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'image_id': 'e6127373-9931-4277-9458-eceef653ea1e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 23 03:47:22 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:47:22 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 23 03:47:22 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71428) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 23 03:47:22 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-23T03:43:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=), allow threads: True {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 23 03:47:22 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Flavor limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 23 03:47:22 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Image limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 23 03:47:22 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Flavor pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 23 03:47:22 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Image pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 23 03:47:22 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 23 03:47:22 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71428) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:564}} Apr 23 03:47:22 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 23 03:47:22 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Got 1 possible topologies {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 23 03:47:22 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 23 03:47:22 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 23 03:47:22 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:47:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-8654037',display_name='tempest-AttachVolumeNegativeTest-server-8654037',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-8654037',id=6,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBChmwKzve84HsDc9e1enTU2dyNO3uMaTpP7qkGUnUryRoyEDEy7qYcgSp8YUuV2PVICAmznt0zJV+FBBbxWJ37uTbw3NkoqcDxcyXKpOSwoZ98BfcM/UT5YtXU1qwYpFAw==',key_name='tempest-keypair-2054139165',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='24fff486a500421397ecb935828582cd',ramdisk_id='',reservation_id='r-i401xtxu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-636753786',owner_user_name='tempest-AttachVolumeNegativeTest-636753786-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:47:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0a2c459cad014b07b2613e5e261d88aa',uuid=3ec36a95-88ce-4ed1-9726-7e6d98674dec,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "af57258e-69b3-405b-a9f6-8d54c1810960", "address": "fa:16:3e:b4:62:dc", "network": {"id": "6adb189c-9b46-4da4-a34b-31edf8685498", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-759967402-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "24fff486a500421397ecb935828582cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf57258e-69", "ovs_interfaceid": "af57258e-69b3-405b-a9f6-8d54c1810960", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71428) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 23 03:47:22 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Converting VIF {"id": "af57258e-69b3-405b-a9f6-8d54c1810960", "address": "fa:16:3e:b4:62:dc", "network": {"id": "6adb189c-9b46-4da4-a34b-31edf8685498", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-759967402-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta":
{"injected": false, "tenant_id": "24fff486a500421397ecb935828582cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf57258e-69", "ovs_interfaceid": "af57258e-69b3-405b-a9f6-8d54c1810960", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:47:22 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b4:62:dc,bridge_name='br-int',has_traffic_filtering=True,id=af57258e-69b3-405b-a9f6-8d54c1810960,network=Network(6adb189c-9b46-4da4-a34b-31edf8685498),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf57258e-69') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:47:22 user nova-compute[71428]: DEBUG nova.objects.instance [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lazy-loading 'pci_devices' on Instance uuid 3ec36a95-88ce-4ed1-9726-7e6d98674dec {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:47:22 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] End _get_guest_xml xml=
Apr 23 03:47:22 user nova-compute[71428]: [guest domain XML elided: the XML markup was stripped when this log was extracted, leaving only bare element text spread over the following journal lines; the surviving values identify instance 3ec36a95-88ce-4ed1-9726-7e6d98674dec / instance-00000006 (memory 131072, 1 vCPU, display name tempest-AttachVolumeNegativeTest-server-8654037, created 2023-04-23 03:47:22, flavor values 128/1/0/0/1, project tempest-AttachVolumeNegativeTest-636753786, sysinfo OpenStack Foundation / OpenStack Nova / 0.0.0, "Virtual Machine", os type hvm, CPU model Nehalem, rng backend /dev/urandom)] {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}}
Apr 23 03:47:22 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:47:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-8654037',display_name='tempest-AttachVolumeNegativeTest-server-8654037',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-8654037',id=6,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBChmwKzve84HsDc9e1enTU2dyNO3uMaTpP7qkGUnUryRoyEDEy7qYcgSp8YUuV2PVICAmznt0zJV+FBBbxWJ37uTbw3NkoqcDxcyXKpOSwoZ98BfcM/UT5YtXU1qwYpFAw==',key_name='tempest-keypair-2054139165',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='24fff486a500421397ecb935828582cd',ramdisk_id='',reservation_id='r-i401xtxu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-636753786',owner_user_name='tempest-AttachVolumeNegativeTest-636753786-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:47:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0a2c459cad014b07b2613e5e261d88aa',uuid=3ec36a95-88ce-4ed1-9726-7e6d98674dec,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "af57258e-69b3-405b-a9f6-8d54c1810960", "address": "fa:16:3e:b4:62:dc", "network": {"id": "6adb189c-9b46-4da4-a34b-31edf8685498", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-759967402-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "24fff486a500421397ecb935828582cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf57258e-69", "ovs_interfaceid": "af57258e-69b3-405b-a9f6-8d54c1810960", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 23 03:47:22 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Converting VIF {"id": "af57258e-69b3-405b-a9f6-8d54c1810960", "address": "fa:16:3e:b4:62:dc", "network": {"id": "6adb189c-9b46-4da4-a34b-31edf8685498", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-759967402-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": 
false, "tenant_id": "24fff486a500421397ecb935828582cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf57258e-69", "ovs_interfaceid": "af57258e-69b3-405b-a9f6-8d54c1810960", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:47:22 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b4:62:dc,bridge_name='br-int',has_traffic_filtering=True,id=af57258e-69b3-405b-a9f6-8d54c1810960,network=Network(6adb189c-9b46-4da4-a34b-31edf8685498),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf57258e-69') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:47:22 user nova-compute[71428]: DEBUG os_vif [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:62:dc,bridge_name='br-int',has_traffic_filtering=True,id=af57258e-69b3-405b-a9f6-8d54c1810960,network=Network(6adb189c-9b46-4da4-a34b-31edf8685498),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf57258e-69') {{(pid=71428) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 23 03:47:22 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:22 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:47:22 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 23 03:47:22 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:22 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaf57258e-69, may_exist=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:47:22 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapaf57258e-69, col_values=(('external_ids', {'iface-id': 'af57258e-69b3-405b-a9f6-8d54c1810960', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b4:62:dc', 'vm-uuid': '3ec36a95-88ce-4ed1-9726-7e6d98674dec'}),)) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:47:22 user nova-compute[71428]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:22 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:47:22 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:22 user nova-compute[71428]: INFO os_vif [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b4:62:dc,bridge_name='br-int',has_traffic_filtering=True,id=af57258e-69b3-405b-a9f6-8d54c1810960,network=Network(6adb189c-9b46-4da4-a34b-31edf8685498),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf57258e-69') Apr 23 03:47:22 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] No BDM found with device name vda, not building metadata. {{(pid=71428) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 23 03:47:22 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] No VIF found with MAC fa:16:3e:b4:62:dc, not building metadata {{(pid=71428) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 23 03:47:22 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:22 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:22 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:22 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:23 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:23 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:23 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:23 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:23 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:23 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] 
on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:23 user nova-compute[71428]: DEBUG nova.compute.manager [req-39a16bcd-5181-4215-accb-ebc29341007f req-919cddbe-ac93-4e47-a461-fb3de0e68b22 service nova] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Received event network-vif-plugged-97d945a1-b86e-4a6d-af52-23084b8eb175 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:47:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-39a16bcd-5181-4215-accb-ebc29341007f req-919cddbe-ac93-4e47-a461-fb3de0e68b22 service nova] Acquiring lock "954f41b2-5c67-41a4-b38b-ebf3ad60cac7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:47:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-39a16bcd-5181-4215-accb-ebc29341007f req-919cddbe-ac93-4e47-a461-fb3de0e68b22 service nova] Lock "954f41b2-5c67-41a4-b38b-ebf3ad60cac7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:47:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-39a16bcd-5181-4215-accb-ebc29341007f req-919cddbe-ac93-4e47-a461-fb3de0e68b22 service nova] Lock "954f41b2-5c67-41a4-b38b-ebf3ad60cac7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:47:23 user nova-compute[71428]: DEBUG nova.compute.manager [req-39a16bcd-5181-4215-accb-ebc29341007f req-919cddbe-ac93-4e47-a461-fb3de0e68b22 service nova] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] No waiting events found dispatching network-vif-plugged-97d945a1-b86e-4a6d-af52-23084b8eb175 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:47:23 user nova-compute[71428]: WARNING nova.compute.manager [req-39a16bcd-5181-4215-accb-ebc29341007f req-919cddbe-ac93-4e47-a461-fb3de0e68b22 service nova] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Received unexpected event network-vif-plugged-97d945a1-b86e-4a6d-af52-23084b8eb175 for instance with vm_state building and task_state spawning. 
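The ovsdbapp transaction logged above (AddBridgeCommand, AddPortCommand, then DbSetCommand on the Interface row) is the os-vif 'ovs' plugin attaching tapaf57258e-69 to br-int and stamping it with the Neutron port ID, MAC and instance UUID. As a rough comparison only, and not something nova-compute itself runs, the same wiring can be expressed with ovs-vsctl; the sketch below is driven from the values in the log:

# Hedged sketch: approximate the ovsdbapp transaction above with ovs-vsctl.
# Values are taken from the log; use only as a reading aid, not as Nova's code.
import subprocess

BRIDGE = "br-int"
PORT = "tapaf57258e-69"
EXTERNAL_IDS = {
    "iface-id": "af57258e-69b3-405b-a9f6-8d54c1810960",
    "iface-status": "active",
    "attached-mac": "fa:16:3e:b4:62:dc",
    "vm-uuid": "3ec36a95-88ce-4ed1-9726-7e6d98674dec",
}

def run(*args):
    # Echo and execute one ovs-vsctl invocation.
    print("+", " ".join(args))
    subprocess.run(args, check=True)

# AddBridgeCommand(name=br-int, may_exist=True, datapath_type=system)
run("ovs-vsctl", "--may-exist", "add-br", BRIDGE,
    "--", "set", "Bridge", BRIDGE, "datapath_type=system")
# AddPortCommand(bridge=br-int, port=tapaf57258e-69, may_exist=True)
run("ovs-vsctl", "--may-exist", "add-port", BRIDGE, PORT)
# DbSetCommand(table=Interface, record=tapaf57258e-69, external_ids={...})
run("ovs-vsctl", "set", "Interface", PORT,
    *[f"external_ids:{k}={v}" for k, v in EXTERNAL_IDS.items()])

Nova talks to OVSDB natively through ovsdbapp rather than shelling out, which is why the journal shows "Running txn" entries (and "Transaction caused no change" for the already-existing bridge) instead of subprocess executions.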
Apr 23 03:47:24 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:24 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:24 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:24 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:25 user nova-compute[71428]: DEBUG nova.compute.manager [req-8f77cc54-656a-45fe-8263-10f179b8cdb6 req-703c5785-3fb1-4134-b943-942574fb18c3 service nova] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Received event network-vif-plugged-af57258e-69b3-405b-a9f6-8d54c1810960 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:47:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-8f77cc54-656a-45fe-8263-10f179b8cdb6 req-703c5785-3fb1-4134-b943-942574fb18c3 service nova] Acquiring lock "3ec36a95-88ce-4ed1-9726-7e6d98674dec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:47:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-8f77cc54-656a-45fe-8263-10f179b8cdb6 req-703c5785-3fb1-4134-b943-942574fb18c3 service nova] Lock "3ec36a95-88ce-4ed1-9726-7e6d98674dec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:47:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-8f77cc54-656a-45fe-8263-10f179b8cdb6 req-703c5785-3fb1-4134-b943-942574fb18c3 service nova] Lock "3ec36a95-88ce-4ed1-9726-7e6d98674dec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:47:25 user nova-compute[71428]: DEBUG nova.compute.manager [req-8f77cc54-656a-45fe-8263-10f179b8cdb6 req-703c5785-3fb1-4134-b943-942574fb18c3 service nova] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] No waiting events found dispatching network-vif-plugged-af57258e-69b3-405b-a9f6-8d54c1810960 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:47:25 user nova-compute[71428]: WARNING nova.compute.manager [req-8f77cc54-656a-45fe-8263-10f179b8cdb6 req-703c5785-3fb1-4134-b943-942574fb18c3 service nova] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Received unexpected event network-vif-plugged-af57258e-69b3-405b-a9f6-8d54c1810960 for instance with vm_state building and task_state spawning. 
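The pattern repeated above for both instances is Neutron delivering network-vif-plugged events before the compute manager has registered a waiter: the per-instance event lock is taken and released, no waiting event is found, and the event is logged as unexpected while the instance is still building/spawning. Below is a small, illustrative parser for pulling those records out of journal output; the regular expression and field names are my own, based only on the line format shown here:

# Hedged sketch: extract "Received [unexpected] event network-vif-plugged-<port>"
# records from nova-compute journal lines in the format shown above.
import re
import sys
from collections import defaultdict

EVENT_RE = re.compile(
    r"\[instance: (?P<instance>[0-9a-f-]{36})\] Received (?P<unexpected>unexpected )?"
    r"event network-vif-plugged-(?P<port>[0-9a-f-]{36})"
)

def collect(lines):
    """Group vif-plugged events by instance UUID."""
    events = defaultdict(list)
    for line in lines:
        m = EVENT_RE.search(line)
        if m:
            events[m.group("instance")].append(
                (m.group("port"), bool(m.group("unexpected")))
            )
    return events

if __name__ == "__main__":
    for instance, seen in collect(sys.stdin).items():
        for port, unexpected in seen:
            flag = "unexpected" if unexpected else "expected"
            print(f"{instance}  vif-plugged {port}  ({flag})")

Run against this journal it would report, for example, that 954f41b2-5c67-41a4-b38b-ebf3ad60cac7 received network-vif-plugged-97d945a1-b86e-4a6d-af52-23084b8eb175 twice, both times flagged as unexpected.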
Apr 23 03:47:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Acquiring lock "91756f90-733e-4aa5-9108-d2d8b1d020fe" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:47:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Lock "91756f90-733e-4aa5-9108-d2d8b1d020fe" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:47:25 user nova-compute[71428]: DEBUG nova.compute.manager [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Starting instance... {{(pid=71428) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 23 03:47:25 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:25 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:47:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:47:25 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71428) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 23 03:47:25 user nova-compute[71428]: INFO nova.compute.claims [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Claim successful on node user Apr 23 03:47:25 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:47:25 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:47:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.378s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:47:25 user nova-compute[71428]: DEBUG nova.compute.manager [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Start building networks asynchronously for instance. 
{{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 23 03:47:25 user nova-compute[71428]: DEBUG nova.compute.manager [req-95bd324c-1752-4d4b-9c9f-6c27fd732508 req-5c74d4c9-692e-418f-afd1-92b0c2d0bbb9 service nova] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Received event network-vif-plugged-97d945a1-b86e-4a6d-af52-23084b8eb175 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:47:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-95bd324c-1752-4d4b-9c9f-6c27fd732508 req-5c74d4c9-692e-418f-afd1-92b0c2d0bbb9 service nova] Acquiring lock "954f41b2-5c67-41a4-b38b-ebf3ad60cac7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:47:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-95bd324c-1752-4d4b-9c9f-6c27fd732508 req-5c74d4c9-692e-418f-afd1-92b0c2d0bbb9 service nova] Lock "954f41b2-5c67-41a4-b38b-ebf3ad60cac7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:47:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-95bd324c-1752-4d4b-9c9f-6c27fd732508 req-5c74d4c9-692e-418f-afd1-92b0c2d0bbb9 service nova] Lock "954f41b2-5c67-41a4-b38b-ebf3ad60cac7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:47:25 user nova-compute[71428]: DEBUG nova.compute.manager [req-95bd324c-1752-4d4b-9c9f-6c27fd732508 req-5c74d4c9-692e-418f-afd1-92b0c2d0bbb9 service nova] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] No waiting events found dispatching network-vif-plugged-97d945a1-b86e-4a6d-af52-23084b8eb175 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:47:25 user nova-compute[71428]: WARNING nova.compute.manager [req-95bd324c-1752-4d4b-9c9f-6c27fd732508 req-5c74d4c9-692e-418f-afd1-92b0c2d0bbb9 service nova] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Received unexpected event network-vif-plugged-97d945a1-b86e-4a6d-af52-23084b8eb175 for instance with vm_state building and task_state spawning. Apr 23 03:47:25 user nova-compute[71428]: DEBUG nova.compute.manager [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Allocating IP information in the background. {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 23 03:47:25 user nova-compute[71428]: DEBUG nova.network.neutron [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] allocate_for_instance() {{(pid=71428) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 23 03:47:25 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 23 03:47:25 user nova-compute[71428]: DEBUG nova.compute.manager [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Start building block device mappings for instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 23 03:47:25 user nova-compute[71428]: DEBUG nova.compute.manager [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Start spawning the instance on the hypervisor. {{(pid=71428) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 23 03:47:25 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Creating instance directory {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 23 03:47:25 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Creating image(s) Apr 23 03:47:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Acquiring lock "/opt/stack/data/nova/instances/91756f90-733e-4aa5-9108-d2d8b1d020fe/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:47:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Lock "/opt/stack/data/nova/instances/91756f90-733e-4aa5-9108-d2d8b1d020fe/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:47:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Lock "/opt/stack/data/nova/instances/91756f90-733e-4aa5-9108-d2d8b1d020fe/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:47:25 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG nova.policy [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3e80c354abd34bd3a28ddaeec9535af2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5eb0a03655cf4aa78e27c81ea4e1c424', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71428) authorize /opt/stack/nova/nova/policy.py:203}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.139s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Acquiring lock "935dc3474dddbde110531a31510941caddc0ae83" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.147s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG oslo_concurrency.processutils 
[None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/91756f90-733e-4aa5-9108-d2d8b1d020fe/disk 1073741824 {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Resumed> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:47:26 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] VM Resumed (Lifecycle Event) Apr 23 03:47:26 user nova-compute[71428]: DEBUG nova.compute.manager [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Instance event wait completed in 0 seconds for {{(pid=71428) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Guest created on hypervisor {{(pid=71428) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/91756f90-733e-4aa5-9108-d2d8b1d020fe/disk 1073741824" returned: 0 in 0.054s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.206s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Checking state {{(pid=71428) _get_power_state 
/opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:47:26 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Instance spawned successfully. Apr 23 03:47:26 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 03:47:26 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] During sync_power_state the instance has a pending task (spawning). Skip. Apr 23 03:47:26 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Started> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:47:26 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] VM Started (Lifecycle Event) Apr 23 03:47:26 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Found default for hw_cdrom_bus of ide {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Found default for hw_disk_bus of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Found default for hw_input_bus of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Found default for hw_pointer_model of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None 
req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Found default for hw_video_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Found default for hw_vif_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 03:47:26 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] During sync_power_state the instance has a pending task (spawning). Skip. Apr 23 03:47:26 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.148s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Checking if we can resize image /opt/stack/data/nova/instances/91756f90-733e-4aa5-9108-d2d8b1d020fe/disk. 
size=1073741824 {{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91756f90-733e-4aa5-9108-d2d8b1d020fe/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:47:26 user nova-compute[71428]: INFO nova.compute.manager [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Took 8.52 seconds to spawn the instance on the hypervisor. Apr 23 03:47:26 user nova-compute[71428]: DEBUG nova.compute.manager [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:47:26 user nova-compute[71428]: INFO nova.compute.manager [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Took 9.48 seconds to build instance. Apr 23 03:47:26 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91756f90-733e-4aa5-9108-d2d8b1d020fe/disk --force-share --output=json" returned: 0 in 0.145s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Cannot resize image /opt/stack/data/nova/instances/91756f90-733e-4aa5-9108-d2d8b1d020fe/disk to a smaller size. 
{{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG nova.objects.instance [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Lazy-loading 'migration_context' on Instance uuid 91756f90-733e-4aa5-9108-d2d8b1d020fe {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c421d738-3b88-4d05-88f0-40b1cf0dc20f tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Lock "954f41b2-5c67-41a4-b38b-ebf3ad60cac7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 9.594s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Created local disks {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Ensure instance console log exists: /opt/stack/data/nova/instances/91756f90-733e-4aa5-9108-d2d8b1d020fe/console.log {{(pid=71428) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Resumed> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:47:26 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] VM Resumed (Lifecycle Event) Apr 23 03:47:26 user nova-compute[71428]: DEBUG nova.compute.manager 
[None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Instance event wait completed in 0 seconds for {{(pid=71428) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Guest created on hypervisor {{(pid=71428) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 23 03:47:26 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Instance spawned successfully. Apr 23 03:47:26 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Found default for hw_cdrom_bus of ide {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Found default for hw_disk_bus of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Found default for hw_input_bus of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Found default for hw_pointer_model of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 
3ec36a95-88ce-4ed1-9726-7e6d98674dec] Found default for hw_video_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Found default for hw_vif_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 03:47:26 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] During sync_power_state the instance has a pending task (spawning). Skip. Apr 23 03:47:26 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Started> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:47:26 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] VM Started (Lifecycle Event) Apr 23 03:47:26 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:47:26 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 03:47:26 user nova-compute[71428]: INFO nova.compute.manager [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Took 8.12 seconds to spawn the instance on the hypervisor. Apr 23 03:47:26 user nova-compute[71428]: DEBUG nova.compute.manager [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:47:26 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] During sync_power_state the instance has a pending task (spawning). Skip. 
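Image preparation for 91756f90-733e-4aa5-9108-d2d8b1d020fe in the records above follows the qcow2-backed path: qemu-img info is run on the cached base image under oslo_concurrency.prlimit (address-space and CPU caps), a qcow2 overlay is created with backing_file pointing at the base, and the requested 1073741824-byte size is compared against the image's virtual size, which is never shrunk (hence the "Cannot resize image ... to a smaller size" message). The following is a simplified, hypothetical replay of those qemu-img steps; the overlay path is a placeholder and the prlimit wrapper is omitted:

# Hedged sketch: the qemu-img info / create sequence visible in the log,
# reduced to its essentials.
import json
import subprocess

BASE = "/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83"
DISK = "/tmp/example-overlay.qcow2"      # hypothetical overlay path
SIZE = 1073741824                        # requested size from the log

def qemu_img_info(path):
    out = subprocess.check_output(
        ["qemu-img", "info", path, "--force-share", "--output=json"])
    return json.loads(out)

print("base virtual size:", qemu_img_info(BASE)["virtual-size"])

# qemu-img create -f qcow2 -o backing_file=<base>,backing_fmt=raw <disk> 1073741824
subprocess.check_call(
    ["qemu-img", "create", "-f", "qcow2",
     "-o", f"backing_file={BASE},backing_fmt=raw", DISK, str(SIZE)])

# Mirrors the can_resize_image check seen in the log: images are only ever grown.
if SIZE <= qemu_img_info(DISK)["virtual-size"]:
    print("requested size does not exceed the virtual size; leaving the image as-is")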
Apr 23 03:47:26 user nova-compute[71428]: INFO nova.compute.manager [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Took 9.07 seconds to build instance. Apr 23 03:47:27 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-40643321-84d4-4946-a9f5-e0dfdc4ace4f tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "3ec36a95-88ce-4ed1-9726-7e6d98674dec" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 9.188s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:47:27 user nova-compute[71428]: DEBUG nova.compute.manager [req-e257a1c9-e44f-4f53-bc65-4eabdd755d3e req-a299e933-1052-45b1-8e8e-e5ff3fe8d97d service nova] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Received event network-vif-plugged-af57258e-69b3-405b-a9f6-8d54c1810960 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:47:27 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-e257a1c9-e44f-4f53-bc65-4eabdd755d3e req-a299e933-1052-45b1-8e8e-e5ff3fe8d97d service nova] Acquiring lock "3ec36a95-88ce-4ed1-9726-7e6d98674dec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:47:27 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-e257a1c9-e44f-4f53-bc65-4eabdd755d3e req-a299e933-1052-45b1-8e8e-e5ff3fe8d97d service nova] Lock "3ec36a95-88ce-4ed1-9726-7e6d98674dec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:47:27 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-e257a1c9-e44f-4f53-bc65-4eabdd755d3e req-a299e933-1052-45b1-8e8e-e5ff3fe8d97d service nova] Lock "3ec36a95-88ce-4ed1-9726-7e6d98674dec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:47:27 user nova-compute[71428]: DEBUG nova.compute.manager [req-e257a1c9-e44f-4f53-bc65-4eabdd755d3e req-a299e933-1052-45b1-8e8e-e5ff3fe8d97d service nova] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] No waiting events found dispatching network-vif-plugged-af57258e-69b3-405b-a9f6-8d54c1810960 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:47:27 user nova-compute[71428]: WARNING nova.compute.manager [req-e257a1c9-e44f-4f53-bc65-4eabdd755d3e req-a299e933-1052-45b1-8e8e-e5ff3fe8d97d service nova] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Received unexpected event network-vif-plugged-af57258e-69b3-405b-a9f6-8d54c1810960 for instance with vm_state active and task_state None. 
Apr 23 03:47:27 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:28 user nova-compute[71428]: DEBUG nova.network.neutron [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Successfully created port: b7f45c1f-8b9b-4d06-a389-d7643565e934 {{(pid=71428) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 23 03:47:28 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:30 user nova-compute[71428]: DEBUG nova.network.neutron [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Successfully updated port: b7f45c1f-8b9b-4d06-a389-d7643565e934 {{(pid=71428) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 23 03:47:30 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Acquiring lock "refresh_cache-91756f90-733e-4aa5-9108-d2d8b1d020fe" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:47:30 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Acquired lock "refresh_cache-91756f90-733e-4aa5-9108-d2d8b1d020fe" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:47:30 user nova-compute[71428]: DEBUG nova.network.neutron [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Building network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 23 03:47:30 user nova-compute[71428]: DEBUG nova.compute.manager [req-fe85a720-8f76-4791-bbc1-0c289275afe2 req-d3b2ae38-0dfa-4049-aee9-c28adf6e0378 service nova] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Received event network-changed-b7f45c1f-8b9b-4d06-a389-d7643565e934 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:47:30 user nova-compute[71428]: DEBUG nova.compute.manager [req-fe85a720-8f76-4791-bbc1-0c289275afe2 req-d3b2ae38-0dfa-4049-aee9-c28adf6e0378 service nova] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Refreshing instance network info cache due to event network-changed-b7f45c1f-8b9b-4d06-a389-d7643565e934. 
{{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 23 03:47:30 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-fe85a720-8f76-4791-bbc1-0c289275afe2 req-d3b2ae38-0dfa-4049-aee9-c28adf6e0378 service nova] Acquiring lock "refresh_cache-91756f90-733e-4aa5-9108-d2d8b1d020fe" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:47:30 user nova-compute[71428]: DEBUG nova.network.neutron [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Instance cache missing network info. {{(pid=71428) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 23 03:47:30 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:31 user nova-compute[71428]: DEBUG nova.network.neutron [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Updating instance_info_cache with network_info: [{"id": "b7f45c1f-8b9b-4d06-a389-d7643565e934", "address": "fa:16:3e:29:a9:72", "network": {"id": "dca689d5-53e5-466b-bd5c-4c29b330d27e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2008984306-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5eb0a03655cf4aa78e27c81ea4e1c424", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f45c1f-8b", "ovs_interfaceid": "b7f45c1f-8b9b-4d06-a389-d7643565e934", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:47:31 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Releasing lock "refresh_cache-91756f90-733e-4aa5-9108-d2d8b1d020fe" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:47:31 user nova-compute[71428]: DEBUG nova.compute.manager [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Instance network_info: |[{"id": "b7f45c1f-8b9b-4d06-a389-d7643565e934", "address": "fa:16:3e:29:a9:72", "network": {"id": "dca689d5-53e5-466b-bd5c-4c29b330d27e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2008984306-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], 
"version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5eb0a03655cf4aa78e27c81ea4e1c424", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f45c1f-8b", "ovs_interfaceid": "b7f45c1f-8b9b-4d06-a389-d7643565e934", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 23 03:47:31 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-fe85a720-8f76-4791-bbc1-0c289275afe2 req-d3b2ae38-0dfa-4049-aee9-c28adf6e0378 service nova] Acquired lock "refresh_cache-91756f90-733e-4aa5-9108-d2d8b1d020fe" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:47:31 user nova-compute[71428]: DEBUG nova.network.neutron [req-fe85a720-8f76-4791-bbc1-0c289275afe2 req-d3b2ae38-0dfa-4049-aee9-c28adf6e0378 service nova] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Refreshing network info cache for port b7f45c1f-8b9b-4d06-a389-d7643565e934 {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 23 03:47:31 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Start _get_guest_xml network_info=[{"id": "b7f45c1f-8b9b-4d06-a389-d7643565e934", "address": "fa:16:3e:29:a9:72", "network": {"id": "dca689d5-53e5-466b-bd5c-4c29b330d27e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2008984306-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5eb0a03655cf4aa78e27c81ea4e1c424", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f45c1f-8b", "ovs_interfaceid": "b7f45c1f-8b9b-4d06-a389-d7643565e934", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 
'guest_format': None, 'boot_index': 0, 'image_id': 'e6127373-9931-4277-9458-eceef653ea1e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 23 03:47:31 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:47:31 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:47:31 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71428) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 23 03:47:31 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-23T03:43:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=), allow threads: True {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 23 03:47:31 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Flavor limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 23 03:47:31 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Image limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 23 03:47:31 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Flavor pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 23 03:47:31 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 
tempest-VolumesAdminNegativeTest-594073230-project-member] Image pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 23 03:47:31 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 23 03:47:31 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 23 03:47:31 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 23 03:47:31 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Got 1 possible topologies {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 23 03:47:31 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 23 03:47:31 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 23 03:47:31 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:47:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-1191600208',display_name='tempest-VolumesAdminNegativeTest-server-1191600208',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-1191600208',id=7,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAkS25YZ2FFDYFsUr7jiuPczrSOLIzJlU2a693RRWPK2F4nPp0/U+VjXRqgfa72DvpZuiWhq7U6qcmOGOvf1l/Ty3NboreHITR0GpknbVZF5naj3IXms4m51pMYQyScIuQ==',key_name='tempest-keypair-780145109',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5eb0a03655cf4aa78e27c81ea4e1c424',ramdisk_id='',reservation_id='r-9jr4wy53',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-594073230',owner_user_name='tempest-VolumesAdminNegativeTest-594073230-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:47:26Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e80c354abd34bd3a28ddaeec9535af2',uuid=91756f90-733e-4aa5-9108-d2d8b1d020fe,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b7f45c1f-8b9b-4d06-a389-d7643565e934", "address": "fa:16:3e:29:a9:72", "network": {"id": "dca689d5-53e5-466b-bd5c-4c29b330d27e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2008984306-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5eb0a03655cf4aa78e27c81ea4e1c424", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f45c1f-8b", "ovs_interfaceid": "b7f45c1f-8b9b-4d06-a389-d7643565e934", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71428) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 23 03:47:31 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Converting VIF {"id": "b7f45c1f-8b9b-4d06-a389-d7643565e934", "address": "fa:16:3e:29:a9:72", "network": {"id": "dca689d5-53e5-466b-bd5c-4c29b330d27e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2008984306-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "5eb0a03655cf4aa78e27c81ea4e1c424", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f45c1f-8b", "ovs_interfaceid": "b7f45c1f-8b9b-4d06-a389-d7643565e934", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:47:31 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:29:a9:72,bridge_name='br-int',has_traffic_filtering=True,id=b7f45c1f-8b9b-4d06-a389-d7643565e934,network=Network(dca689d5-53e5-466b-bd5c-4c29b330d27e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7f45c1f-8b') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:47:31 user nova-compute[71428]: DEBUG nova.objects.instance [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Lazy-loading 'pci_devices' on Instance uuid 91756f90-733e-4aa5-9108-d2d8b1d020fe {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:47:31 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] End _get_guest_xml xml= Apr 23 03:47:31 user nova-compute[71428]: 91756f90-733e-4aa5-9108-d2d8b1d020fe Apr 23 03:47:31 user nova-compute[71428]: instance-00000007 Apr 23 03:47:31 user nova-compute[71428]: 131072 Apr 23 03:47:31 user nova-compute[71428]: 1 Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: tempest-VolumesAdminNegativeTest-server-1191600208 Apr 23 03:47:31 user nova-compute[71428]: 2023-04-23 03:47:31 Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: 128 Apr 23 03:47:31 user nova-compute[71428]: 1 Apr 23 03:47:31 user nova-compute[71428]: 0 Apr 23 03:47:31 user nova-compute[71428]: 0 Apr 23 03:47:31 user nova-compute[71428]: 1 Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: tempest-VolumesAdminNegativeTest-594073230-project-member Apr 23 03:47:31 user nova-compute[71428]: tempest-VolumesAdminNegativeTest-594073230 Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: OpenStack Foundation Apr 23 03:47:31 user nova-compute[71428]: OpenStack Nova Apr 23 03:47:31 user nova-compute[71428]: 0.0.0 Apr 23 03:47:31 user 
nova-compute[71428]: 91756f90-733e-4aa5-9108-d2d8b1d020fe Apr 23 03:47:31 user nova-compute[71428]: 91756f90-733e-4aa5-9108-d2d8b1d020fe Apr 23 03:47:31 user nova-compute[71428]: Virtual Machine Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: hvm Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Nehalem Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: /dev/urandom Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: Apr 23 03:47:31 user nova-compute[71428]: {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 23 03:47:31 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:47:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-1191600208',display_name='tempest-VolumesAdminNegativeTest-server-1191600208',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-1191600208',id=7,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAkS25YZ2FFDYFsUr7jiuPczrSOLIzJlU2a693RRWPK2F4nPp0/U+VjXRqgfa72DvpZuiWhq7U6qcmOGOvf1l/Ty3NboreHITR0GpknbVZF5naj3IXms4m51pMYQyScIuQ==',key_name='tempest-keypair-780145109',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5eb0a03655cf4aa78e27c81ea4e1c424',ramdisk_id='',reservation_id='r-9jr4wy53',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-594073230',owner_user_name='tempest-VolumesAdminNegativeTest-594073230-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:47:26Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e80c354abd34bd3a28ddaeec9535af2',uuid=91756f90-733e-4aa5-9108-d2d8b1d020fe,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b7f45c1f-8b9b-4d06-a389-d7643565e934", "address": "fa:16:3e:29:a9:72", "network": {"id": "dca689d5-53e5-466b-bd5c-4c29b330d27e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2008984306-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5eb0a03655cf4aa78e27c81ea4e1c424", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f45c1f-8b", "ovs_interfaceid": "b7f45c1f-8b9b-4d06-a389-d7643565e934", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 23 03:47:31 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Converting VIF {"id": "b7f45c1f-8b9b-4d06-a389-d7643565e934", "address": "fa:16:3e:29:a9:72", "network": {"id": "dca689d5-53e5-466b-bd5c-4c29b330d27e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2008984306-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": 
false, "tenant_id": "5eb0a03655cf4aa78e27c81ea4e1c424", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f45c1f-8b", "ovs_interfaceid": "b7f45c1f-8b9b-4d06-a389-d7643565e934", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:47:31 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:29:a9:72,bridge_name='br-int',has_traffic_filtering=True,id=b7f45c1f-8b9b-4d06-a389-d7643565e934,network=Network(dca689d5-53e5-466b-bd5c-4c29b330d27e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7f45c1f-8b') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:47:31 user nova-compute[71428]: DEBUG os_vif [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:29:a9:72,bridge_name='br-int',has_traffic_filtering=True,id=b7f45c1f-8b9b-4d06-a389-d7643565e934,network=Network(dca689d5-53e5-466b-bd5c-4c29b330d27e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7f45c1f-8b') {{(pid=71428) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 23 03:47:31 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:31 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:47:31 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 23 03:47:31 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:31 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb7f45c1f-8b, may_exist=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:47:31 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb7f45c1f-8b, col_values=(('external_ids', {'iface-id': 'b7f45c1f-8b9b-4d06-a389-d7643565e934', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:29:a9:72', 'vm-uuid': '91756f90-733e-4aa5-9108-d2d8b1d020fe'}),)) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:47:31 user nova-compute[71428]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:31 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:47:31 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:31 user nova-compute[71428]: INFO os_vif [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:29:a9:72,bridge_name='br-int',has_traffic_filtering=True,id=b7f45c1f-8b9b-4d06-a389-d7643565e934,network=Network(dca689d5-53e5-466b-bd5c-4c29b330d27e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7f45c1f-8b') Apr 23 03:47:31 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] No BDM found with device name vda, not building metadata. {{(pid=71428) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 23 03:47:31 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] No VIF found with MAC fa:16:3e:29:a9:72, not building metadata {{(pid=71428) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 23 03:47:31 user nova-compute[71428]: DEBUG nova.network.neutron [req-fe85a720-8f76-4791-bbc1-0c289275afe2 req-d3b2ae38-0dfa-4049-aee9-c28adf6e0378 service nova] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Updated VIF entry in instance network info cache for port b7f45c1f-8b9b-4d06-a389-d7643565e934. 
{{(pid=71428) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 23 03:47:31 user nova-compute[71428]: DEBUG nova.network.neutron [req-fe85a720-8f76-4791-bbc1-0c289275afe2 req-d3b2ae38-0dfa-4049-aee9-c28adf6e0378 service nova] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Updating instance_info_cache with network_info: [{"id": "b7f45c1f-8b9b-4d06-a389-d7643565e934", "address": "fa:16:3e:29:a9:72", "network": {"id": "dca689d5-53e5-466b-bd5c-4c29b330d27e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2008984306-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5eb0a03655cf4aa78e27c81ea4e1c424", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f45c1f-8b", "ovs_interfaceid": "b7f45c1f-8b9b-4d06-a389-d7643565e934", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:47:31 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-fe85a720-8f76-4791-bbc1-0c289275afe2 req-d3b2ae38-0dfa-4049-aee9-c28adf6e0378 service nova] Releasing lock "refresh_cache-91756f90-733e-4aa5-9108-d2d8b1d020fe" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:47:32 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:32 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:32 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:32 user nova-compute[71428]: DEBUG nova.compute.manager [req-fb76d5c1-e412-4697-b91d-de57c7ed7d44 req-094b7672-5560-4d74-9ec4-8ada70a87e0f service nova] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Received event network-vif-plugged-b7f45c1f-8b9b-4d06-a389-d7643565e934 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:47:32 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-fb76d5c1-e412-4697-b91d-de57c7ed7d44 req-094b7672-5560-4d74-9ec4-8ada70a87e0f service nova] Acquiring lock "91756f90-733e-4aa5-9108-d2d8b1d020fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:47:32 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-fb76d5c1-e412-4697-b91d-de57c7ed7d44 req-094b7672-5560-4d74-9ec4-8ada70a87e0f service nova] Lock "91756f90-733e-4aa5-9108-d2d8b1d020fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:47:32 user 
nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-fb76d5c1-e412-4697-b91d-de57c7ed7d44 req-094b7672-5560-4d74-9ec4-8ada70a87e0f service nova] Lock "91756f90-733e-4aa5-9108-d2d8b1d020fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:47:32 user nova-compute[71428]: DEBUG nova.compute.manager [req-fb76d5c1-e412-4697-b91d-de57c7ed7d44 req-094b7672-5560-4d74-9ec4-8ada70a87e0f service nova] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] No waiting events found dispatching network-vif-plugged-b7f45c1f-8b9b-4d06-a389-d7643565e934 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:47:32 user nova-compute[71428]: WARNING nova.compute.manager [req-fb76d5c1-e412-4697-b91d-de57c7ed7d44 req-094b7672-5560-4d74-9ec4-8ada70a87e0f service nova] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Received unexpected event network-vif-plugged-b7f45c1f-8b9b-4d06-a389-d7643565e934 for instance with vm_state building and task_state spawning. Apr 23 03:47:33 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:33 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:33 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:33 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:33 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:34 user nova-compute[71428]: DEBUG nova.compute.manager [req-508356aa-4d25-4e85-9197-83b3fb28b1d7 req-b06be3b1-a005-473b-9268-5d5944c39872 service nova] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Received event network-vif-plugged-b7f45c1f-8b9b-4d06-a389-d7643565e934 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:47:34 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-508356aa-4d25-4e85-9197-83b3fb28b1d7 req-b06be3b1-a005-473b-9268-5d5944c39872 service nova] Acquiring lock "91756f90-733e-4aa5-9108-d2d8b1d020fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:47:34 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-508356aa-4d25-4e85-9197-83b3fb28b1d7 req-b06be3b1-a005-473b-9268-5d5944c39872 service nova] Lock "91756f90-733e-4aa5-9108-d2d8b1d020fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:47:34 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-508356aa-4d25-4e85-9197-83b3fb28b1d7 req-b06be3b1-a005-473b-9268-5d5944c39872 service nova] Lock "91756f90-733e-4aa5-9108-d2d8b1d020fe-events" "released" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:47:34 user nova-compute[71428]: DEBUG nova.compute.manager [req-508356aa-4d25-4e85-9197-83b3fb28b1d7 req-b06be3b1-a005-473b-9268-5d5944c39872 service nova] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] No waiting events found dispatching network-vif-plugged-b7f45c1f-8b9b-4d06-a389-d7643565e934 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:47:34 user nova-compute[71428]: WARNING nova.compute.manager [req-508356aa-4d25-4e85-9197-83b3fb28b1d7 req-b06be3b1-a005-473b-9268-5d5944c39872 service nova] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Received unexpected event network-vif-plugged-b7f45c1f-8b9b-4d06-a389-d7643565e934 for instance with vm_state building and task_state spawning. Apr 23 03:47:35 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Resumed> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:47:35 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] VM Resumed (Lifecycle Event) Apr 23 03:47:35 user nova-compute[71428]: DEBUG nova.compute.manager [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Instance event wait completed in 0 seconds for {{(pid=71428) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 23 03:47:35 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Guest created on hypervisor {{(pid=71428) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 23 03:47:35 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Instance spawned successfully. 
Apr 23 03:47:35 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 23 03:47:35 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:47:35 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 03:47:35 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Found default for hw_cdrom_bus of ide {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:47:35 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Found default for hw_disk_bus of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:47:35 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Found default for hw_input_bus of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:47:35 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Found default for hw_pointer_model of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:47:35 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Found default for hw_video_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:47:35 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 
91756f90-733e-4aa5-9108-d2d8b1d020fe] Found default for hw_vif_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:47:35 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] During sync_power_state the instance has a pending task (spawning). Skip. Apr 23 03:47:35 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Started> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:47:35 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] VM Started (Lifecycle Event) Apr 23 03:47:35 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:47:35 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 03:47:35 user nova-compute[71428]: INFO nova.compute.manager [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Took 9.17 seconds to spawn the instance on the hypervisor. Apr 23 03:47:35 user nova-compute[71428]: DEBUG nova.compute.manager [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:47:35 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 23 03:47:35 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Acquiring lock "6f7cf091-3140-4746-bd1c-95c42ea82f6e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:47:35 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lock "6f7cf091-3140-4746-bd1c-95c42ea82f6e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:47:35 user nova-compute[71428]: INFO nova.compute.manager [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Took 9.94 seconds to build instance. Apr 23 03:47:35 user nova-compute[71428]: DEBUG nova.compute.manager [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Starting instance... {{(pid=71428) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 23 03:47:35 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c2eeaeb7-98dd-4844-917f-ee36a9c69673 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Lock "91756f90-733e-4aa5-9108-d2d8b1d020fe" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 10.082s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:47:35 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:47:35 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:47:35 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71428) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 23 03:47:35 user nova-compute[71428]: INFO nova.compute.claims [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Claim successful on node user Apr 23 03:47:35 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:47:35 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:47:35 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.398s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:47:35 user nova-compute[71428]: DEBUG nova.compute.manager [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Start building networks asynchronously for instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 23 03:47:35 user nova-compute[71428]: DEBUG nova.compute.manager [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Allocating IP information in the background. {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 23 03:47:35 user nova-compute[71428]: DEBUG nova.network.neutron [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] allocate_for_instance() {{(pid=71428) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 23 03:47:35 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 23 03:47:35 user nova-compute[71428]: DEBUG nova.compute.manager [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Start building block device mappings for instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 23 03:47:35 user nova-compute[71428]: DEBUG nova.policy [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '93a8dbfd8cef4578aff742813ffe901e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8ab0f01751954d04a83b360b2f839716', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71428) authorize /opt/stack/nova/nova/policy.py:203}} Apr 23 03:47:36 user nova-compute[71428]: DEBUG nova.compute.manager [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Start spawning the instance on the hypervisor. {{(pid=71428) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 23 03:47:36 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Creating instance directory {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 23 03:47:36 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Creating image(s) Apr 23 03:47:36 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Acquiring lock "/opt/stack/data/nova/instances/6f7cf091-3140-4746-bd1c-95c42ea82f6e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:47:36 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lock "/opt/stack/data/nova/instances/6f7cf091-3140-4746-bd1c-95c42ea82f6e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:47:36 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lock 
"/opt/stack/data/nova/instances/6f7cf091-3140-4746-bd1c-95c42ea82f6e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:47:36 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Acquiring lock "97472359e3c61528f7f876622d5dba95f32a6e68" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:47:36 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lock "97472359e3c61528f7f876622d5dba95f32a6e68" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:47:36 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:36 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/97472359e3c61528f7f876622d5dba95f32a6e68.part --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:47:36 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/97472359e3c61528f7f876622d5dba95f32a6e68.part --force-share --output=json" returned: 0 in 0.164s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:47:36 user nova-compute[71428]: DEBUG nova.virt.images [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] 7c47c35a-5c7e-4aee-95e4-4a8cee90e1cd was qcow2, converting to raw {{(pid=71428) fetch_to_raw /opt/stack/nova/nova/virt/images.py:165}} Apr 23 03:47:36 user nova-compute[71428]: DEBUG nova.privsep.utils [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71428) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 23 03:47:36 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 
/opt/stack/data/nova/instances/_base/97472359e3c61528f7f876622d5dba95f32a6e68.part /opt/stack/data/nova/instances/_base/97472359e3c61528f7f876622d5dba95f32a6e68.converted {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:47:36 user nova-compute[71428]: DEBUG nova.network.neutron [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Successfully created port: fcf25b83-715e-4e6e-a7a3-d7f94072883f {{(pid=71428) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 23 03:47:36 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/97472359e3c61528f7f876622d5dba95f32a6e68.part /opt/stack/data/nova/instances/_base/97472359e3c61528f7f876622d5dba95f32a6e68.converted" returned: 0 in 0.174s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:47:36 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/97472359e3c61528f7f876622d5dba95f32a6e68.converted --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:47:37 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/97472359e3c61528f7f876622d5dba95f32a6e68.converted --force-share --output=json" returned: 0 in 0.159s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:47:37 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lock "97472359e3c61528f7f876622d5dba95f32a6e68" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.864s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:47:37 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/97472359e3c61528f7f876622d5dba95f32a6e68 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:47:37 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None 
req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/97472359e3c61528f7f876622d5dba95f32a6e68 --force-share --output=json" returned: 0 in 0.152s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:47:37 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Acquiring lock "97472359e3c61528f7f876622d5dba95f32a6e68" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:47:37 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lock "97472359e3c61528f7f876622d5dba95f32a6e68" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.003s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:47:37 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/97472359e3c61528f7f876622d5dba95f32a6e68 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:47:37 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/97472359e3c61528f7f876622d5dba95f32a6e68 --force-share --output=json" returned: 0 in 0.163s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:47:37 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/97472359e3c61528f7f876622d5dba95f32a6e68,backing_fmt=raw /opt/stack/data/nova/instances/6f7cf091-3140-4746-bd1c-95c42ea82f6e/disk 1073741824 {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:47:37 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/97472359e3c61528f7f876622d5dba95f32a6e68,backing_fmt=raw 
/opt/stack/data/nova/instances/6f7cf091-3140-4746-bd1c-95c42ea82f6e/disk 1073741824" returned: 0 in 0.070s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:47:37 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lock "97472359e3c61528f7f876622d5dba95f32a6e68" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.242s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:47:37 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/97472359e3c61528f7f876622d5dba95f32a6e68 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:47:37 user nova-compute[71428]: DEBUG nova.network.neutron [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Successfully updated port: fcf25b83-715e-4e6e-a7a3-d7f94072883f {{(pid=71428) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 23 03:47:37 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/97472359e3c61528f7f876622d5dba95f32a6e68 --force-share --output=json" returned: 0 in 0.164s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:47:37 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Checking if we can resize image /opt/stack/data/nova/instances/6f7cf091-3140-4746-bd1c-95c42ea82f6e/disk. 
size=1073741824 {{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 23 03:47:37 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6f7cf091-3140-4746-bd1c-95c42ea82f6e/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:47:37 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Acquiring lock "refresh_cache-6f7cf091-3140-4746-bd1c-95c42ea82f6e" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:47:37 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Acquired lock "refresh_cache-6f7cf091-3140-4746-bd1c-95c42ea82f6e" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:47:37 user nova-compute[71428]: DEBUG nova.network.neutron [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Building network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 23 03:47:37 user nova-compute[71428]: DEBUG nova.compute.manager [req-205899d8-e00b-4ab4-afc9-6e273003845a req-74804157-5d91-4fbd-9cd1-730df3d5e67f service nova] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Received event network-changed-fcf25b83-715e-4e6e-a7a3-d7f94072883f {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:47:37 user nova-compute[71428]: DEBUG nova.compute.manager [req-205899d8-e00b-4ab4-afc9-6e273003845a req-74804157-5d91-4fbd-9cd1-730df3d5e67f service nova] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Refreshing instance network info cache due to event network-changed-fcf25b83-715e-4e6e-a7a3-d7f94072883f. 
{{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 23 03:47:37 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-205899d8-e00b-4ab4-afc9-6e273003845a req-74804157-5d91-4fbd-9cd1-730df3d5e67f service nova] Acquiring lock "refresh_cache-6f7cf091-3140-4746-bd1c-95c42ea82f6e" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:47:37 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6f7cf091-3140-4746-bd1c-95c42ea82f6e/disk --force-share --output=json" returned: 0 in 0.149s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:47:37 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Cannot resize image /opt/stack/data/nova/instances/6f7cf091-3140-4746-bd1c-95c42ea82f6e/disk to a smaller size. {{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 23 03:47:37 user nova-compute[71428]: DEBUG nova.objects.instance [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lazy-loading 'migration_context' on Instance uuid 6f7cf091-3140-4746-bd1c-95c42ea82f6e {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:47:37 user nova-compute[71428]: DEBUG nova.network.neutron [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Instance cache missing network info. 
{{(pid=71428) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 23 03:47:37 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Created local disks {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 23 03:47:37 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Ensure instance console log exists: /opt/stack/data/nova/instances/6f7cf091-3140-4746-bd1c-95c42ea82f6e/console.log {{(pid=71428) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 23 03:47:37 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:47:37 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:47:37 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:47:38 user nova-compute[71428]: DEBUG nova.network.neutron [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Updating instance_info_cache with network_info: [{"id": "fcf25b83-715e-4e6e-a7a3-d7f94072883f", "address": "fa:16:3e:65:a9:56", "network": {"id": "a603a26f-3b06-4792-80ae-73cea9f9d53b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1385818779-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8ab0f01751954d04a83b360b2f839716", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfcf25b83-71", "ovs_interfaceid": "fcf25b83-715e-4e6e-a7a3-d7f94072883f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) 
update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:47:38 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Releasing lock "refresh_cache-6f7cf091-3140-4746-bd1c-95c42ea82f6e" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:47:38 user nova-compute[71428]: DEBUG nova.compute.manager [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Instance network_info: |[{"id": "fcf25b83-715e-4e6e-a7a3-d7f94072883f", "address": "fa:16:3e:65:a9:56", "network": {"id": "a603a26f-3b06-4792-80ae-73cea9f9d53b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1385818779-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8ab0f01751954d04a83b360b2f839716", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfcf25b83-71", "ovs_interfaceid": "fcf25b83-715e-4e6e-a7a3-d7f94072883f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 23 03:47:38 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-205899d8-e00b-4ab4-afc9-6e273003845a req-74804157-5d91-4fbd-9cd1-730df3d5e67f service nova] Acquired lock "refresh_cache-6f7cf091-3140-4746-bd1c-95c42ea82f6e" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:47:38 user nova-compute[71428]: DEBUG nova.network.neutron [req-205899d8-e00b-4ab4-afc9-6e273003845a req-74804157-5d91-4fbd-9cd1-730df3d5e67f service nova] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Refreshing network info cache for port fcf25b83-715e-4e6e-a7a3-d7f94072883f {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 23 03:47:38 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Start _get_guest_xml network_info=[{"id": "fcf25b83-715e-4e6e-a7a3-d7f94072883f", "address": "fa:16:3e:65:a9:56", "network": {"id": "a603a26f-3b06-4792-80ae-73cea9f9d53b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1385818779-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8ab0f01751954d04a83b360b2f839716", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": 
{"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfcf25b83-71", "ovs_interfaceid": "fcf25b83-715e-4e6e-a7a3-d7f94072883f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:47:32Z,direct_url=,disk_format='qcow2',id=7c47c35a-5c7e-4aee-95e4-4a8cee90e1cd,min_disk=0,min_ram=0,name='tempest-scenario-img--9935587',owner='8ab0f01751954d04a83b360b2f839716',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:47:34Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'image_id': '7c47c35a-5c7e-4aee-95e4-4a8cee90e1cd'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 23 03:47:38 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:47:38 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 23 03:47:38 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71428) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 23 03:47:38 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-23T03:43:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:47:32Z,direct_url=,disk_format='qcow2',id=7c47c35a-5c7e-4aee-95e4-4a8cee90e1cd,min_disk=0,min_ram=0,name='tempest-scenario-img--9935587',owner='8ab0f01751954d04a83b360b2f839716',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:47:34Z,virtual_size=,visibility=), allow threads: True {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 23 03:47:38 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Flavor limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 23 03:47:38 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Image limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 23 03:47:38 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Flavor pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 23 03:47:38 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Image pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 23 03:47:38 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 23 03:47:38 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71428) 
_get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 23 03:47:38 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 23 03:47:38 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Got 1 possible topologies {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 23 03:47:38 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 23 03:47:38 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 23 03:47:38 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:47:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-262802774',display_name='tempest-TestMinimumBasicScenario-server-262802774',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-262802774',id=8,image_ref='7c47c35a-5c7e-4aee-95e4-4a8cee90e1cd',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLKn6a8rESzq46bLmR05LGujkm1bgXgZ+9BeETyXSPwaGi5H5ZetPmCwt6N2oLGkFtyS9BduJD7406fOIM9/S2/jcwpt1OzWCv1MZI9XkD4sbEN2KdlWlfO9OLzEy4zxDA==',key_name='tempest-TestMinimumBasicScenario-1120445103',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8ab0f01751954d04a83b360b2f839716',ramdisk_id='',reservation_id='r-8tf6frub',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7c47c35a-5c7e-4aee-95e4-4a8cee90e1cd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-1558592168',owner_user_name='tempest-TestMinimumBasicScenario-1558592168-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:47:36Z,user_data=None,user_id='93a8dbfd8cef4578aff742813ffe901e',uuid=6f7cf091-3140-4746-bd1c-95c42ea82f6e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fcf25b83-715e-4e6e-a7a3-d7f94072883f", "address": "fa:16:3e:65:a9:56", "network": {"id": "a603a26f-3b06-4792-80ae-73cea9f9d53b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1385818779-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8ab0f01751954d04a83b360b2f839716", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfcf25b83-71", "ovs_interfaceid": "fcf25b83-715e-4e6e-a7a3-d7f94072883f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71428) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 23 03:47:38 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Converting VIF {"id": "fcf25b83-715e-4e6e-a7a3-d7f94072883f", "address": "fa:16:3e:65:a9:56", "network": {"id": "a603a26f-3b06-4792-80ae-73cea9f9d53b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1385818779-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8ab0f01751954d04a83b360b2f839716", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfcf25b83-71", "ovs_interfaceid": 
"fcf25b83-715e-4e6e-a7a3-d7f94072883f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:47:38 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:a9:56,bridge_name='br-int',has_traffic_filtering=True,id=fcf25b83-715e-4e6e-a7a3-d7f94072883f,network=Network(a603a26f-3b06-4792-80ae-73cea9f9d53b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfcf25b83-71') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:47:38 user nova-compute[71428]: DEBUG nova.objects.instance [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lazy-loading 'pci_devices' on Instance uuid 6f7cf091-3140-4746-bd1c-95c42ea82f6e {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:47:38 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] End _get_guest_xml xml= Apr 23 03:47:38 user nova-compute[71428]: 6f7cf091-3140-4746-bd1c-95c42ea82f6e Apr 23 03:47:38 user nova-compute[71428]: instance-00000008 Apr 23 03:47:38 user nova-compute[71428]: 131072 Apr 23 03:47:38 user nova-compute[71428]: 1 Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: tempest-TestMinimumBasicScenario-server-262802774 Apr 23 03:47:38 user nova-compute[71428]: 2023-04-23 03:47:38 Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: 128 Apr 23 03:47:38 user nova-compute[71428]: 1 Apr 23 03:47:38 user nova-compute[71428]: 0 Apr 23 03:47:38 user nova-compute[71428]: 0 Apr 23 03:47:38 user nova-compute[71428]: 1 Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: tempest-TestMinimumBasicScenario-1558592168-project-member Apr 23 03:47:38 user nova-compute[71428]: tempest-TestMinimumBasicScenario-1558592168 Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: OpenStack Foundation Apr 23 03:47:38 user nova-compute[71428]: OpenStack Nova Apr 23 03:47:38 user nova-compute[71428]: 0.0.0 Apr 23 03:47:38 user nova-compute[71428]: 6f7cf091-3140-4746-bd1c-95c42ea82f6e Apr 23 03:47:38 user nova-compute[71428]: 6f7cf091-3140-4746-bd1c-95c42ea82f6e Apr 23 03:47:38 user nova-compute[71428]: Virtual Machine Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user 
nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: hvm Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Nehalem Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: /dev/urandom Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: Apr 23 03:47:38 user nova-compute[71428]: {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 23 03:47:38 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:47:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-262802774',display_name='tempest-TestMinimumBasicScenario-server-262802774',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-262802774',id=8,image_ref='7c47c35a-5c7e-4aee-95e4-4a8cee90e1cd',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLKn6a8rESzq46bLmR05LGujkm1bgXgZ+9BeETyXSPwaGi5H5ZetPmCwt6N2oLGkFtyS9BduJD7406fOIM9/S2/jcwpt1OzWCv1MZI9XkD4sbEN2KdlWlfO9OLzEy4zxDA==',key_name='tempest-TestMinimumBasicScenario-1120445103',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8ab0f01751954d04a83b360b2f839716',ramdisk_id='',reservation_id='r-8tf6frub',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7c47c35a-5c7e-4aee-95e4-4a8cee90e1cd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-1558592168',owner_user_name='tempest-TestMinimumBasicScenario-1558592168-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:47:36Z,user_data=None,user_id='93a8dbfd8cef4578aff742813ffe901e',uuid=6f7cf091-3140-4746-bd1c-95c42ea82f6e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fcf25b83-715e-4e6e-a7a3-d7f94072883f", "address": "fa:16:3e:65:a9:56", "network": {"id": "a603a26f-3b06-4792-80ae-73cea9f9d53b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1385818779-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8ab0f01751954d04a83b360b2f839716", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfcf25b83-71", "ovs_interfaceid": "fcf25b83-715e-4e6e-a7a3-d7f94072883f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 23 03:47:38 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Converting VIF {"id": "fcf25b83-715e-4e6e-a7a3-d7f94072883f", "address": "fa:16:3e:65:a9:56", "network": {"id": "a603a26f-3b06-4792-80ae-73cea9f9d53b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1385818779-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8ab0f01751954d04a83b360b2f839716", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfcf25b83-71", "ovs_interfaceid": 
"fcf25b83-715e-4e6e-a7a3-d7f94072883f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:47:38 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:a9:56,bridge_name='br-int',has_traffic_filtering=True,id=fcf25b83-715e-4e6e-a7a3-d7f94072883f,network=Network(a603a26f-3b06-4792-80ae-73cea9f9d53b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfcf25b83-71') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:47:38 user nova-compute[71428]: DEBUG os_vif [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:a9:56,bridge_name='br-int',has_traffic_filtering=True,id=fcf25b83-715e-4e6e-a7a3-d7f94072883f,network=Network(a603a26f-3b06-4792-80ae-73cea9f9d53b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfcf25b83-71') {{(pid=71428) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 23 03:47:38 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:38 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:47:38 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 23 03:47:38 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:38 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfcf25b83-71, may_exist=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:47:38 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfcf25b83-71, col_values=(('external_ids', {'iface-id': 'fcf25b83-715e-4e6e-a7a3-d7f94072883f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:65:a9:56', 'vm-uuid': '6f7cf091-3140-4746-bd1c-95c42ea82f6e'}),)) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:47:38 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:38 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:47:38 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:38 user nova-compute[71428]: INFO os_vif [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:a9:56,bridge_name='br-int',has_traffic_filtering=True,id=fcf25b83-715e-4e6e-a7a3-d7f94072883f,network=Network(a603a26f-3b06-4792-80ae-73cea9f9d53b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfcf25b83-71') Apr 23 03:47:38 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] No BDM found with device name vda, not building metadata. {{(pid=71428) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 23 03:47:38 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] No VIF found with MAC fa:16:3e:65:a9:56, not building metadata {{(pid=71428) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 23 03:47:38 user nova-compute[71428]: DEBUG nova.network.neutron [req-205899d8-e00b-4ab4-afc9-6e273003845a req-74804157-5d91-4fbd-9cd1-730df3d5e67f service nova] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Updated VIF entry in instance network info cache for port fcf25b83-715e-4e6e-a7a3-d7f94072883f. 
{{(pid=71428) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 23 03:47:38 user nova-compute[71428]: DEBUG nova.network.neutron [req-205899d8-e00b-4ab4-afc9-6e273003845a req-74804157-5d91-4fbd-9cd1-730df3d5e67f service nova] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Updating instance_info_cache with network_info: [{"id": "fcf25b83-715e-4e6e-a7a3-d7f94072883f", "address": "fa:16:3e:65:a9:56", "network": {"id": "a603a26f-3b06-4792-80ae-73cea9f9d53b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1385818779-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8ab0f01751954d04a83b360b2f839716", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfcf25b83-71", "ovs_interfaceid": "fcf25b83-715e-4e6e-a7a3-d7f94072883f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:47:38 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:38 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-205899d8-e00b-4ab4-afc9-6e273003845a req-74804157-5d91-4fbd-9cd1-730df3d5e67f service nova] Releasing lock "refresh_cache-6f7cf091-3140-4746-bd1c-95c42ea82f6e" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:47:39 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:39 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:39 user nova-compute[71428]: DEBUG nova.compute.manager [req-7f614295-35c4-4685-9bf7-d4266aa6bccd req-815d1cf8-380c-4a99-a7cf-49a4b9517ae8 service nova] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Received event network-vif-plugged-fcf25b83-715e-4e6e-a7a3-d7f94072883f {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:47:39 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-7f614295-35c4-4685-9bf7-d4266aa6bccd req-815d1cf8-380c-4a99-a7cf-49a4b9517ae8 service nova] Acquiring lock "6f7cf091-3140-4746-bd1c-95c42ea82f6e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:47:40 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-7f614295-35c4-4685-9bf7-d4266aa6bccd req-815d1cf8-380c-4a99-a7cf-49a4b9517ae8 service nova] Lock "6f7cf091-3140-4746-bd1c-95c42ea82f6e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:47:40 
user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-7f614295-35c4-4685-9bf7-d4266aa6bccd req-815d1cf8-380c-4a99-a7cf-49a4b9517ae8 service nova] Lock "6f7cf091-3140-4746-bd1c-95c42ea82f6e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:47:40 user nova-compute[71428]: DEBUG nova.compute.manager [req-7f614295-35c4-4685-9bf7-d4266aa6bccd req-815d1cf8-380c-4a99-a7cf-49a4b9517ae8 service nova] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] No waiting events found dispatching network-vif-plugged-fcf25b83-715e-4e6e-a7a3-d7f94072883f {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:47:40 user nova-compute[71428]: WARNING nova.compute.manager [req-7f614295-35c4-4685-9bf7-d4266aa6bccd req-815d1cf8-380c-4a99-a7cf-49a4b9517ae8 service nova] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Received unexpected event network-vif-plugged-fcf25b83-715e-4e6e-a7a3-d7f94072883f for instance with vm_state building and task_state spawning. Apr 23 03:47:40 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:40 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:40 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:40 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:41 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Resumed> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:47:41 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] VM Resumed (Lifecycle Event) Apr 23 03:47:41 user nova-compute[71428]: DEBUG nova.compute.manager [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Instance event wait completed in 0 seconds for {{(pid=71428) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 23 03:47:41 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Guest created on hypervisor {{(pid=71428) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 23 03:47:41 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Instance spawned successfully. 
Apr 23 03:47:41 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 23 03:47:41 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:47:41 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 03:47:41 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Found default for hw_cdrom_bus of ide {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:47:41 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Found default for hw_disk_bus of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:47:41 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Found default for hw_input_bus of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:47:41 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Found default for hw_pointer_model of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:47:41 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Found default for hw_video_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:47:41 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 
6f7cf091-3140-4746-bd1c-95c42ea82f6e] Found default for hw_vif_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:47:41 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] During sync_power_state the instance has a pending task (spawning). Skip. Apr 23 03:47:41 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Started> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:47:41 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] VM Started (Lifecycle Event) Apr 23 03:47:42 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:47:42 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 03:47:42 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 23 03:47:42 user nova-compute[71428]: DEBUG nova.compute.manager [req-cf3d98a2-e50f-4ce6-a589-c0c40c1088f2 req-bd8ced29-d204-4b48-90a5-04dc70f428de service nova] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Received event network-vif-plugged-fcf25b83-715e-4e6e-a7a3-d7f94072883f {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:47:42 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-cf3d98a2-e50f-4ce6-a589-c0c40c1088f2 req-bd8ced29-d204-4b48-90a5-04dc70f428de service nova] Acquiring lock "6f7cf091-3140-4746-bd1c-95c42ea82f6e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:47:42 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-cf3d98a2-e50f-4ce6-a589-c0c40c1088f2 req-bd8ced29-d204-4b48-90a5-04dc70f428de service nova] Lock "6f7cf091-3140-4746-bd1c-95c42ea82f6e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:47:42 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-cf3d98a2-e50f-4ce6-a589-c0c40c1088f2 req-bd8ced29-d204-4b48-90a5-04dc70f428de service nova] Lock "6f7cf091-3140-4746-bd1c-95c42ea82f6e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:47:42 user nova-compute[71428]: DEBUG nova.compute.manager [req-cf3d98a2-e50f-4ce6-a589-c0c40c1088f2 req-bd8ced29-d204-4b48-90a5-04dc70f428de service nova] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] No waiting events found dispatching network-vif-plugged-fcf25b83-715e-4e6e-a7a3-d7f94072883f {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:47:42 user nova-compute[71428]: WARNING nova.compute.manager [req-cf3d98a2-e50f-4ce6-a589-c0c40c1088f2 req-bd8ced29-d204-4b48-90a5-04dc70f428de service nova] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Received unexpected event network-vif-plugged-fcf25b83-715e-4e6e-a7a3-d7f94072883f for instance with vm_state building and task_state spawning. Apr 23 03:47:42 user nova-compute[71428]: INFO nova.compute.manager [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Took 5.93 seconds to spawn the instance on the hypervisor. Apr 23 03:47:42 user nova-compute[71428]: DEBUG nova.compute.manager [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:47:42 user nova-compute[71428]: INFO nova.compute.manager [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Took 6.80 seconds to build instance. 
Apr 23 03:47:42 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-ba6ac190-cce1-464b-a166-1e096f604f3b tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lock "6f7cf091-3140-4746-bd1c-95c42ea82f6e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.926s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:47:43 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:43 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:48 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:48 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:53 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:53 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:47:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:03 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:03 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:05 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:48:05 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:48:05 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:48:06 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-5f6b52b7-298c-452e-a7f4-ae50dd891271 tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Acquiring lock "56d8da41-3e04-465b-a1de-73d9e994682d" 
by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:48:06 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-5f6b52b7-298c-452e-a7f4-ae50dd891271 tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Lock "56d8da41-3e04-465b-a1de-73d9e994682d" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:48:06 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-5f6b52b7-298c-452e-a7f4-ae50dd891271 tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Acquiring lock "56d8da41-3e04-465b-a1de-73d9e994682d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:48:06 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-5f6b52b7-298c-452e-a7f4-ae50dd891271 tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Lock "56d8da41-3e04-465b-a1de-73d9e994682d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.002s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:48:06 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-5f6b52b7-298c-452e-a7f4-ae50dd891271 tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Lock "56d8da41-3e04-465b-a1de-73d9e994682d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:48:06 user nova-compute[71428]: INFO nova.compute.manager [None req-5f6b52b7-298c-452e-a7f4-ae50dd891271 tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Terminating instance Apr 23 03:48:06 user nova-compute[71428]: DEBUG nova.compute.manager [None req-5f6b52b7-298c-452e-a7f4-ae50dd891271 tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Start destroying the instance on the hypervisor. 
{{(pid=71428) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 23 03:48:06 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:06 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:06 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:06 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:07 user nova-compute[71428]: DEBUG nova.compute.manager [req-affcbc82-ce55-49cc-a2c2-73a8796726ba req-8d0db0d5-9859-46fe-ab2f-fdb33f1c42b9 service nova] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Received event network-vif-unplugged-552bc11e-f01b-4ee4-94e0-64ab1e19ad9e {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:48:07 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-affcbc82-ce55-49cc-a2c2-73a8796726ba req-8d0db0d5-9859-46fe-ab2f-fdb33f1c42b9 service nova] Acquiring lock "56d8da41-3e04-465b-a1de-73d9e994682d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:48:07 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-affcbc82-ce55-49cc-a2c2-73a8796726ba req-8d0db0d5-9859-46fe-ab2f-fdb33f1c42b9 service nova] Lock "56d8da41-3e04-465b-a1de-73d9e994682d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:48:07 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-affcbc82-ce55-49cc-a2c2-73a8796726ba req-8d0db0d5-9859-46fe-ab2f-fdb33f1c42b9 service nova] Lock "56d8da41-3e04-465b-a1de-73d9e994682d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:48:07 user nova-compute[71428]: DEBUG nova.compute.manager [req-affcbc82-ce55-49cc-a2c2-73a8796726ba req-8d0db0d5-9859-46fe-ab2f-fdb33f1c42b9 service nova] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] No waiting events found dispatching network-vif-unplugged-552bc11e-f01b-4ee4-94e0-64ab1e19ad9e {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:48:07 user nova-compute[71428]: DEBUG nova.compute.manager [req-affcbc82-ce55-49cc-a2c2-73a8796726ba req-8d0db0d5-9859-46fe-ab2f-fdb33f1c42b9 service nova] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Received event network-vif-unplugged-552bc11e-f01b-4ee4-94e0-64ab1e19ad9e for instance with task_state deleting. {{(pid=71428) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 23 03:48:07 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Instance destroyed successfully. 
Apr 23 03:48:07 user nova-compute[71428]: DEBUG nova.objects.instance [None req-5f6b52b7-298c-452e-a7f4-ae50dd891271 tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Lazy-loading 'resources' on Instance uuid 56d8da41-3e04-465b-a1de-73d9e994682d {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:48:07 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-5f6b52b7-298c-452e-a7f4-ae50dd891271 tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:46:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1138104',display_name='tempest-DeleteServersTestJSON-server-1138104',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-deleteserverstestjson-server-1138104',id=1,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-23T03:46:35Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='c4f91ccdb4da4cb4bc4b55b7ac2189f0',ramdisk_id='',reservation_id='r-kou1b0nk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-DeleteServersTestJSON-2062396414',owner_user_name='tempest-DeleteServersTestJSON-2062396414-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-23T03:46:35Z,user_data=None,user_id='4e6ddab9797c449d85bf6f3a241473e0',uuid=56d8da41-3e04-465b-a1de-73d9e994682d,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "552bc11e-f01b-4ee4-94e0-64ab1e19ad9e", "address": "fa:16:3e:c8:1e:d4", "network": {"id": "47db66e9-5162-4031-9d71-5e13c16ee002", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1314718942-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c4f91ccdb4da4cb4bc4b55b7ac2189f0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": 
"tap552bc11e-f0", "ovs_interfaceid": "552bc11e-f01b-4ee4-94e0-64ab1e19ad9e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 23 03:48:07 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-5f6b52b7-298c-452e-a7f4-ae50dd891271 tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Converting VIF {"id": "552bc11e-f01b-4ee4-94e0-64ab1e19ad9e", "address": "fa:16:3e:c8:1e:d4", "network": {"id": "47db66e9-5162-4031-9d71-5e13c16ee002", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1314718942-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c4f91ccdb4da4cb4bc4b55b7ac2189f0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap552bc11e-f0", "ovs_interfaceid": "552bc11e-f01b-4ee4-94e0-64ab1e19ad9e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:48:07 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-5f6b52b7-298c-452e-a7f4-ae50dd891271 tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c8:1e:d4,bridge_name='br-int',has_traffic_filtering=True,id=552bc11e-f01b-4ee4-94e0-64ab1e19ad9e,network=Network(47db66e9-5162-4031-9d71-5e13c16ee002),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap552bc11e-f0') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:48:07 user nova-compute[71428]: DEBUG os_vif [None req-5f6b52b7-298c-452e-a7f4-ae50dd891271 tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c8:1e:d4,bridge_name='br-int',has_traffic_filtering=True,id=552bc11e-f01b-4ee4-94e0-64ab1e19ad9e,network=Network(47db66e9-5162-4031-9d71-5e13c16ee002),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap552bc11e-f0') {{(pid=71428) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 23 03:48:07 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:07 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap552bc11e-f0, bridge=br-int, if_exists=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:48:07 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:07 user 
nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:48:07 user nova-compute[71428]: INFO os_vif [None req-5f6b52b7-298c-452e-a7f4-ae50dd891271 tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c8:1e:d4,bridge_name='br-int',has_traffic_filtering=True,id=552bc11e-f01b-4ee4-94e0-64ab1e19ad9e,network=Network(47db66e9-5162-4031-9d71-5e13c16ee002),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap552bc11e-f0') Apr 23 03:48:07 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-5f6b52b7-298c-452e-a7f4-ae50dd891271 tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Deleting instance files /opt/stack/data/nova/instances/56d8da41-3e04-465b-a1de-73d9e994682d_del Apr 23 03:48:07 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-5f6b52b7-298c-452e-a7f4-ae50dd891271 tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Deletion of /opt/stack/data/nova/instances/56d8da41-3e04-465b-a1de-73d9e994682d_del complete Apr 23 03:48:07 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-5f6b52b7-298c-452e-a7f4-ae50dd891271 tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Checking UEFI support for host arch (x86_64) {{(pid=71428) supports_uefi /opt/stack/nova/nova/virt/libvirt/host.py:1722}} Apr 23 03:48:07 user nova-compute[71428]: INFO nova.virt.libvirt.host [None req-5f6b52b7-298c-452e-a7f4-ae50dd891271 tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] UEFI support detected Apr 23 03:48:07 user nova-compute[71428]: INFO nova.compute.manager [None req-5f6b52b7-298c-452e-a7f4-ae50dd891271 tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Took 0.71 seconds to destroy the instance on the hypervisor. Apr 23 03:48:07 user nova-compute[71428]: DEBUG oslo.service.loopingcall [None req-5f6b52b7-298c-452e-a7f4-ae50dd891271 tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71428) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 23 03:48:07 user nova-compute[71428]: DEBUG nova.compute.manager [-] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Deallocating network for instance {{(pid=71428) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 23 03:48:07 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] deallocate_for_instance() {{(pid=71428) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 23 03:48:07 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:08 user nova-compute[71428]: DEBUG nova.compute.manager [None req-d5c0ab00-e3cf-4c6c-97dd-6e446669e3bc tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:48:08 user nova-compute[71428]: INFO nova.compute.manager [None req-d5c0ab00-e3cf-4c6c-97dd-6e446669e3bc tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] instance snapshotting Apr 23 03:48:08 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-d5c0ab00-e3cf-4c6c-97dd-6e446669e3bc tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Beginning live snapshot process Apr 23 03:48:08 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Updating instance_info_cache with network_info: [] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:48:08 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Took 1.14 seconds to deallocate network for instance. 
Apr 23 03:48:08 user nova-compute[71428]: DEBUG nova.compute.manager [req-1e6967a9-84b7-49ee-8e9d-858c9188c24d req-c65f068b-8d1e-4285-913b-16f4cd851c29 service nova] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Received event network-vif-deleted-552bc11e-f01b-4ee4-94e0-64ab1e19ad9e {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:48:08 user nova-compute[71428]: INFO nova.compute.manager [req-1e6967a9-84b7-49ee-8e9d-858c9188c24d req-c65f068b-8d1e-4285-913b-16f4cd851c29 service nova] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Neutron deleted interface 552bc11e-f01b-4ee4-94e0-64ab1e19ad9e; detaching it from the instance and deleting it from the info cache Apr 23 03:48:08 user nova-compute[71428]: DEBUG nova.network.neutron [req-1e6967a9-84b7-49ee-8e9d-858c9188c24d req-c65f068b-8d1e-4285-913b-16f4cd851c29 service nova] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Updating instance_info_cache with network_info: [] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:48:08 user nova-compute[71428]: DEBUG nova.compute.manager [req-1e6967a9-84b7-49ee-8e9d-858c9188c24d req-c65f068b-8d1e-4285-913b-16f4cd851c29 service nova] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Detach interface failed, port_id=552bc11e-f01b-4ee4-94e0-64ab1e19ad9e, reason: Instance 56d8da41-3e04-465b-a1de-73d9e994682d could not be found. {{(pid=71428) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 23 03:48:08 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-5f6b52b7-298c-452e-a7f4-ae50dd891271 tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:48:08 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-5f6b52b7-298c-452e-a7f4-ae50dd891271 tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:48:08 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-d5c0ab00-e3cf-4c6c-97dd-6e446669e3bc tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f279f3d3-581d-4d6f-924f-4104ec23832a/disk --force-share --output=json -f qcow2 {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:48:08 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:08 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:48:08 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None 
req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:48:08 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71428) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 23 03:48:08 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:48:08 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-d5c0ab00-e3cf-4c6c-97dd-6e446669e3bc tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f279f3d3-581d-4d6f-924f-4104ec23832a/disk --force-share --output=json -f qcow2" returned: 0 in 0.148s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:48:08 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-d5c0ab00-e3cf-4c6c-97dd-6e446669e3bc tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f279f3d3-581d-4d6f-924f-4104ec23832a/disk --force-share --output=json -f qcow2 {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:48:08 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:48:08 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-5f6b52b7-298c-452e-a7f4-ae50dd891271 tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:48:08 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-d5c0ab00-e3cf-4c6c-97dd-6e446669e3bc tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f279f3d3-581d-4d6f-924f-4104ec23832a/disk --force-share --output=json -f qcow2" returned: 0 in 0.148s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:48:08 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-d5c0ab00-e3cf-4c6c-97dd-6e446669e3bc tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Running cmd (subprocess): 
/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:48:08 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-5f6b52b7-298c-452e-a7f4-ae50dd891271 tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:48:08 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-5f6b52b7-298c-452e-a7f4-ae50dd891271 tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.410s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:48:08 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.183s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:48:08 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:48:08 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Auditing locally available compute resources for user (node: user) {{(pid=71428) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 23 03:48:08 user nova-compute[71428]: INFO nova.scheduler.client.report [None req-5f6b52b7-298c-452e-a7f4-ae50dd891271 tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Deleted allocations for instance 56d8da41-3e04-465b-a1de-73d9e994682d Apr 23 03:48:08 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-d5c0ab00-e3cf-4c6c-97dd-6e446669e3bc tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.146s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:48:08 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-d5c0ab00-e3cf-4c6c-97dd-6e446669e3bc 
tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmpjcu5516o/bdc12284ca594f0c81bd90a2996919b3.delta 1073741824 {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:48:09 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-d5c0ab00-e3cf-4c6c-97dd-6e446669e3bc tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmpjcu5516o/bdc12284ca594f0c81bd90a2996919b3.delta 1073741824" returned: 0 in 0.050s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:48:09 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-d5c0ab00-e3cf-4c6c-97dd-6e446669e3bc tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Quiescing instance not available: QEMU guest agent is not enabled. Apr 23 03:48:09 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-5f6b52b7-298c-452e-a7f4-ae50dd891271 tempest-DeleteServersTestJSON-2062396414 tempest-DeleteServersTestJSON-2062396414-project-member] Lock "56d8da41-3e04-465b-a1de-73d9e994682d" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.500s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:48:09 user nova-compute[71428]: DEBUG nova.compute.manager [req-a99e8c8f-c6e8-40f3-a14d-ebadae33b952 req-acb75be2-c89c-4101-8ae6-49e3ff8ced46 service nova] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Received event network-changed-680802e3-0304-496f-927b-855e4167272b {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:48:09 user nova-compute[71428]: DEBUG nova.compute.manager [req-a99e8c8f-c6e8-40f3-a14d-ebadae33b952 req-acb75be2-c89c-4101-8ae6-49e3ff8ced46 service nova] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Refreshing instance network info cache due to event network-changed-680802e3-0304-496f-927b-855e4167272b. 
{{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 23 03:48:09 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-a99e8c8f-c6e8-40f3-a14d-ebadae33b952 req-acb75be2-c89c-4101-8ae6-49e3ff8ced46 service nova] Acquiring lock "refresh_cache-f279f3d3-581d-4d6f-924f-4104ec23832a" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:48:09 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-a99e8c8f-c6e8-40f3-a14d-ebadae33b952 req-acb75be2-c89c-4101-8ae6-49e3ff8ced46 service nova] Acquired lock "refresh_cache-f279f3d3-581d-4d6f-924f-4104ec23832a" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:48:09 user nova-compute[71428]: DEBUG nova.network.neutron [req-a99e8c8f-c6e8-40f3-a14d-ebadae33b952 req-acb75be2-c89c-4101-8ae6-49e3ff8ced46 service nova] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Refreshing network info cache for port 680802e3-0304-496f-927b-855e4167272b {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 23 03:48:09 user nova-compute[71428]: DEBUG nova.network.neutron [req-a99e8c8f-c6e8-40f3-a14d-ebadae33b952 req-acb75be2-c89c-4101-8ae6-49e3ff8ced46 service nova] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Updated VIF entry in instance network info cache for port 680802e3-0304-496f-927b-855e4167272b. {{(pid=71428) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 23 03:48:09 user nova-compute[71428]: DEBUG nova.network.neutron [req-a99e8c8f-c6e8-40f3-a14d-ebadae33b952 req-acb75be2-c89c-4101-8ae6-49e3ff8ced46 service nova] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Updating instance_info_cache with network_info: [{"id": "680802e3-0304-496f-927b-855e4167272b", "address": "fa:16:3e:45:76:bf", "network": {"id": "2354786d-cb38-48f3-9585-d88f27219024", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-812722065-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.29", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5864626fffa7443c800b244471966d65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap680802e3-03", "ovs_interfaceid": "680802e3-0304-496f-927b-855e4167272b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:48:09 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-a99e8c8f-c6e8-40f3-a14d-ebadae33b952 req-acb75be2-c89c-4101-8ae6-49e3ff8ced46 service nova] Releasing lock "refresh_cache-f279f3d3-581d-4d6f-924f-4104ec23832a" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:48:09 user nova-compute[71428]: DEBUG nova.compute.manager [req-a99e8c8f-c6e8-40f3-a14d-ebadae33b952 req-acb75be2-c89c-4101-8ae6-49e3ff8ced46 service nova] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Received event 
network-vif-plugged-552bc11e-f01b-4ee4-94e0-64ab1e19ad9e {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:48:09 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-a99e8c8f-c6e8-40f3-a14d-ebadae33b952 req-acb75be2-c89c-4101-8ae6-49e3ff8ced46 service nova] Acquiring lock "56d8da41-3e04-465b-a1de-73d9e994682d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:48:09 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-a99e8c8f-c6e8-40f3-a14d-ebadae33b952 req-acb75be2-c89c-4101-8ae6-49e3ff8ced46 service nova] Lock "56d8da41-3e04-465b-a1de-73d9e994682d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:48:09 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-a99e8c8f-c6e8-40f3-a14d-ebadae33b952 req-acb75be2-c89c-4101-8ae6-49e3ff8ced46 service nova] Lock "56d8da41-3e04-465b-a1de-73d9e994682d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:48:09 user nova-compute[71428]: DEBUG nova.compute.manager [req-a99e8c8f-c6e8-40f3-a14d-ebadae33b952 req-acb75be2-c89c-4101-8ae6-49e3ff8ced46 service nova] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] No waiting events found dispatching network-vif-plugged-552bc11e-f01b-4ee4-94e0-64ab1e19ad9e {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:48:09 user nova-compute[71428]: WARNING nova.compute.manager [req-a99e8c8f-c6e8-40f3-a14d-ebadae33b952 req-acb75be2-c89c-4101-8ae6-49e3ff8ced46 service nova] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Received unexpected event network-vif-plugged-552bc11e-f01b-4ee4-94e0-64ab1e19ad9e for instance with vm_state deleted and task_state None. 
Apr 23 03:48:09 user nova-compute[71428]: DEBUG nova.virt.libvirt.guest [None req-d5c0ab00-e3cf-4c6c-97dd-6e446669e3bc tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] COPY block job progress, current cursor: 0 final cursor: 43778048 {{(pid=71428) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 23 03:48:09 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3ec36a95-88ce-4ed1-9726-7e6d98674dec/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:48:09 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3ec36a95-88ce-4ed1-9726-7e6d98674dec/disk --force-share --output=json" returned: 0 in 0.223s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:48:09 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3ec36a95-88ce-4ed1-9726-7e6d98674dec/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:48:10 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3ec36a95-88ce-4ed1-9726-7e6d98674dec/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:48:10 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6f7cf091-3140-4746-bd1c-95c42ea82f6e/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:48:10 user nova-compute[71428]: DEBUG nova.virt.libvirt.guest [None req-d5c0ab00-e3cf-4c6c-97dd-6e446669e3bc tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] COPY block job progress, current cursor: 43778048 final cursor: 43778048 {{(pid=71428) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 23 03:48:10 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-d5c0ab00-e3cf-4c6c-97dd-6e446669e3bc tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Skipping quiescing instance: QEMU guest agent is not enabled. 
Apr 23 03:48:10 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6f7cf091-3140-4746-bd1c-95c42ea82f6e/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:48:10 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6f7cf091-3140-4746-bd1c-95c42ea82f6e/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:48:10 user nova-compute[71428]: DEBUG nova.privsep.utils [None req-d5c0ab00-e3cf-4c6c-97dd-6e446669e3bc tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71428) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 23 03:48:10 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-d5c0ab00-e3cf-4c6c-97dd-6e446669e3bc tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmpjcu5516o/bdc12284ca594f0c81bd90a2996919b3.delta /opt/stack/data/nova/instances/snapshots/tmpjcu5516o/bdc12284ca594f0c81bd90a2996919b3 {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:48:10 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6f7cf091-3140-4746-bd1c-95c42ea82f6e/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:48:10 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/954f41b2-5c67-41a4-b38b-ebf3ad60cac7/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:48:10 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/954f41b2-5c67-41a4-b38b-ebf3ad60cac7/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:48:10 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- 
env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/954f41b2-5c67-41a4-b38b-ebf3ad60cac7/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:48:10 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-d5c0ab00-e3cf-4c6c-97dd-6e446669e3bc tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmpjcu5516o/bdc12284ca594f0c81bd90a2996919b3.delta /opt/stack/data/nova/instances/snapshots/tmpjcu5516o/bdc12284ca594f0c81bd90a2996919b3" returned: 0 in 0.267s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:48:10 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-d5c0ab00-e3cf-4c6c-97dd-6e446669e3bc tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Snapshot extracted, beginning image upload Apr 23 03:48:10 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/954f41b2-5c67-41a4-b38b-ebf3ad60cac7/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:48:10 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:48:10 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:48:10 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:48:10 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:48:10 user 
nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/08d906c5-1698-4c17-8430-c98f10836398/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:48:11 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/08d906c5-1698-4c17-8430-c98f10836398/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:48:11 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/08d906c5-1698-4c17-8430-c98f10836398/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:48:11 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/08d906c5-1698-4c17-8430-c98f10836398/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:48:11 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f279f3d3-581d-4d6f-924f-4104ec23832a/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:48:11 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f279f3d3-581d-4d6f-924f-4104ec23832a/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:48:11 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f279f3d3-581d-4d6f-924f-4104ec23832a/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:48:11 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/f279f3d3-581d-4d6f-924f-4104ec23832a/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:48:11 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91756f90-733e-4aa5-9108-d2d8b1d020fe/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:48:11 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91756f90-733e-4aa5-9108-d2d8b1d020fe/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:48:11 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91756f90-733e-4aa5-9108-d2d8b1d020fe/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:48:11 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91756f90-733e-4aa5-9108-d2d8b1d020fe/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:48:12 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:12 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:48:12 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 23 03:48:12 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Hypervisor/Node resource view: name=user free_ram=8194MB free_disk=26.222606658935547GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71428) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 23 03:48:12 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:48:12 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:48:12 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance f279f3d3-581d-4d6f-924f-4104ec23832a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:48:12 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 08d906c5-1698-4c17-8430-c98f10836398 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:48:12 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:48:12 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 954f41b2-5c67-41a4-b38b-ebf3ad60cac7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:48:12 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 3ec36a95-88ce-4ed1-9726-7e6d98674dec actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:48:12 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 91756f90-733e-4aa5-9108-d2d8b1d020fe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:48:12 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 6f7cf091-3140-4746-bd1c-95c42ea82f6e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:48:12 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Total usable vcpus: 12, total allocated vcpus: 7 {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 23 03:48:12 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Final resource view: name=user phys_ram=16023MB used_ram=1408MB phys_disk=40GB used_disk=7GB total_vcpus=12 used_vcpus=7 pci_stats=[] {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 23 03:48:12 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:48:12 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:48:12 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Compute_service record updated for user:user {{(pid=71428) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 23 03:48:12 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.347s 
{{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:48:12 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-d5c0ab00-e3cf-4c6c-97dd-6e446669e3bc tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Snapshot image upload complete Apr 23 03:48:12 user nova-compute[71428]: INFO nova.compute.manager [None req-d5c0ab00-e3cf-4c6c-97dd-6e446669e3bc tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Took 4.52 seconds to snapshot the instance on the hypervisor. Apr 23 03:48:13 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:13 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:48:13 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Starting heal instance info cache {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 23 03:48:13 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "refresh_cache-f279f3d3-581d-4d6f-924f-4104ec23832a" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:48:13 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquired lock "refresh_cache-f279f3d3-581d-4d6f-924f-4104ec23832a" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:48:13 user nova-compute[71428]: DEBUG nova.network.neutron [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Forcefully refreshing network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 23 03:48:14 user nova-compute[71428]: DEBUG nova.network.neutron [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Updating instance_info_cache with network_info: [{"id": "680802e3-0304-496f-927b-855e4167272b", "address": "fa:16:3e:45:76:bf", "network": {"id": "2354786d-cb38-48f3-9585-d88f27219024", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-812722065-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.29", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5864626fffa7443c800b244471966d65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap680802e3-03", "ovs_interfaceid": "680802e3-0304-496f-927b-855e4167272b", "qbh_params": null, 
"qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:48:14 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Releasing lock "refresh_cache-f279f3d3-581d-4d6f-924f-4104ec23832a" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:48:14 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Updated the network info_cache for instance {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 23 03:48:14 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:48:14 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:48:14 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:48:14 user nova-compute[71428]: DEBUG nova.compute.manager [req-d0d9aae9-3db4-445f-9295-30dfabb7e304 req-800e7fb1-eaf6-438e-aa04-dff46bb667a9 service nova] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Received event network-changed-71dae390-2e66-4961-8ce7-1b8fff845732 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:48:14 user nova-compute[71428]: DEBUG nova.compute.manager [req-d0d9aae9-3db4-445f-9295-30dfabb7e304 req-800e7fb1-eaf6-438e-aa04-dff46bb667a9 service nova] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Refreshing instance network info cache due to event network-changed-71dae390-2e66-4961-8ce7-1b8fff845732. 
{{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 23 03:48:14 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-d0d9aae9-3db4-445f-9295-30dfabb7e304 req-800e7fb1-eaf6-438e-aa04-dff46bb667a9 service nova] Acquiring lock "refresh_cache-08d906c5-1698-4c17-8430-c98f10836398" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:48:14 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-d0d9aae9-3db4-445f-9295-30dfabb7e304 req-800e7fb1-eaf6-438e-aa04-dff46bb667a9 service nova] Acquired lock "refresh_cache-08d906c5-1698-4c17-8430-c98f10836398" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:48:14 user nova-compute[71428]: DEBUG nova.network.neutron [req-d0d9aae9-3db4-445f-9295-30dfabb7e304 req-800e7fb1-eaf6-438e-aa04-dff46bb667a9 service nova] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Refreshing network info cache for port 71dae390-2e66-4961-8ce7-1b8fff845732 {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 23 03:48:15 user nova-compute[71428]: DEBUG nova.network.neutron [req-d0d9aae9-3db4-445f-9295-30dfabb7e304 req-800e7fb1-eaf6-438e-aa04-dff46bb667a9 service nova] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Updated VIF entry in instance network info cache for port 71dae390-2e66-4961-8ce7-1b8fff845732. {{(pid=71428) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 23 03:48:15 user nova-compute[71428]: DEBUG nova.network.neutron [req-d0d9aae9-3db4-445f-9295-30dfabb7e304 req-800e7fb1-eaf6-438e-aa04-dff46bb667a9 service nova] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Updating instance_info_cache with network_info: [{"id": "71dae390-2e66-4961-8ce7-1b8fff845732", "address": "fa:16:3e:6e:74:5a", "network": {"id": "5604c946-dd68-4517-9c39-b4d33d86a345", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1424127131-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.10", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70b031ddc5c94ca98e7161de03bda4b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap71dae390-2e", "ovs_interfaceid": "71dae390-2e66-4961-8ce7-1b8fff845732", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:48:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-d0d9aae9-3db4-445f-9295-30dfabb7e304 req-800e7fb1-eaf6-438e-aa04-dff46bb667a9 service nova] Releasing lock "refresh_cache-08d906c5-1698-4c17-8430-c98f10836398" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:48:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-d8176489-0d45-4827-babd-28d7db9dc1d3 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Acquiring lock 
"08d906c5-1698-4c17-8430-c98f10836398" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:48:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-d8176489-0d45-4827-babd-28d7db9dc1d3 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Lock "08d906c5-1698-4c17-8430-c98f10836398" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:48:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-d8176489-0d45-4827-babd-28d7db9dc1d3 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Acquiring lock "08d906c5-1698-4c17-8430-c98f10836398-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:48:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-d8176489-0d45-4827-babd-28d7db9dc1d3 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Lock "08d906c5-1698-4c17-8430-c98f10836398-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:48:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-d8176489-0d45-4827-babd-28d7db9dc1d3 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Lock "08d906c5-1698-4c17-8430-c98f10836398-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:48:16 user nova-compute[71428]: INFO nova.compute.manager [None req-d8176489-0d45-4827-babd-28d7db9dc1d3 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Terminating instance Apr 23 03:48:16 user nova-compute[71428]: DEBUG nova.compute.manager [None req-d8176489-0d45-4827-babd-28d7db9dc1d3 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Start destroying the instance on the hypervisor. 
{{(pid=71428) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 23 03:48:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:16 user nova-compute[71428]: DEBUG nova.compute.manager [req-6011d725-b988-4944-9df9-4f954d998e8a req-a77c36db-6117-4ce5-8f68-c0397e06c491 service nova] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Received event network-vif-unplugged-71dae390-2e66-4961-8ce7-1b8fff845732 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:48:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-6011d725-b988-4944-9df9-4f954d998e8a req-a77c36db-6117-4ce5-8f68-c0397e06c491 service nova] Acquiring lock "08d906c5-1698-4c17-8430-c98f10836398-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:48:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-6011d725-b988-4944-9df9-4f954d998e8a req-a77c36db-6117-4ce5-8f68-c0397e06c491 service nova] Lock "08d906c5-1698-4c17-8430-c98f10836398-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:48:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-6011d725-b988-4944-9df9-4f954d998e8a req-a77c36db-6117-4ce5-8f68-c0397e06c491 service nova] Lock "08d906c5-1698-4c17-8430-c98f10836398-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:48:16 user nova-compute[71428]: DEBUG nova.compute.manager [req-6011d725-b988-4944-9df9-4f954d998e8a req-a77c36db-6117-4ce5-8f68-c0397e06c491 service nova] [instance: 08d906c5-1698-4c17-8430-c98f10836398] No waiting events found dispatching network-vif-unplugged-71dae390-2e66-4961-8ce7-1b8fff845732 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:48:16 user nova-compute[71428]: DEBUG nova.compute.manager [req-6011d725-b988-4944-9df9-4f954d998e8a req-a77c36db-6117-4ce5-8f68-c0397e06c491 service nova] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Received event network-vif-unplugged-71dae390-2e66-4961-8ce7-1b8fff845732 for instance with task_state deleting. 
{{(pid=71428) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 23 03:48:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:16 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Instance destroyed successfully. Apr 23 03:48:16 user nova-compute[71428]: DEBUG nova.objects.instance [None req-d8176489-0d45-4827-babd-28d7db9dc1d3 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Lazy-loading 'resources' on Instance uuid 08d906c5-1698-4c17-8430-c98f10836398 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:48:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:16 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-d8176489-0d45-4827-babd-28d7db9dc1d3 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:46:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-1420796136',display_name='tempest-AttachVolumeTestJSON-server-1420796136',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-1420796136',id=3,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAlSA3423uMfFY5jWf8qjLn8fKTUZDZCbWqlOeGrb7q4dIPzzhHHma7J2h5uyB7cueX2lnELTDhfWhiWxlTN0oUzC4mH9t+dt+HV9lsJMmuXTTekJoF4InNO5IUJTfbZHQ==',key_name='tempest-keypair-1821880172',keypairs=,launch_index=0,launched_at=2023-04-23T03:46:35Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='70b031ddc5c94ca98e7161de03bda4b7',ramdisk_id='',reservation_id='r-x3wh6pwx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeTestJSON-276721084',owner_user_name='tempest-AttachVolumeTestJSON-276721084-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-23T03:46:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e3d689f1c160478ca83bbff3104d8ec3',uuid=08d906c5-1698-4c17-8430-c98f10836398,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "71dae390-2e66-4961-8ce7-1b8fff845732", "address": "fa:16:3e:6e:74:5a", "network": {"id": "5604c946-dd68-4517-9c39-b4d33d86a345", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1424127131-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.10", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70b031ddc5c94ca98e7161de03bda4b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap71dae390-2e", "ovs_interfaceid": "71dae390-2e66-4961-8ce7-1b8fff845732", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 23 03:48:16 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-d8176489-0d45-4827-babd-28d7db9dc1d3 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Converting VIF {"id": "71dae390-2e66-4961-8ce7-1b8fff845732", "address": "fa:16:3e:6e:74:5a", "network": {"id": "5604c946-dd68-4517-9c39-b4d33d86a345", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1424127131-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.10", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70b031ddc5c94ca98e7161de03bda4b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap71dae390-2e", "ovs_interfaceid": "71dae390-2e66-4961-8ce7-1b8fff845732", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:48:16 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-d8176489-0d45-4827-babd-28d7db9dc1d3 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6e:74:5a,bridge_name='br-int',has_traffic_filtering=True,id=71dae390-2e66-4961-8ce7-1b8fff845732,network=Network(5604c946-dd68-4517-9c39-b4d33d86a345),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71dae390-2e') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:48:16 user nova-compute[71428]: DEBUG os_vif [None req-d8176489-0d45-4827-babd-28d7db9dc1d3 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6e:74:5a,bridge_name='br-int',has_traffic_filtering=True,id=71dae390-2e66-4961-8ce7-1b8fff845732,network=Network(5604c946-dd68-4517-9c39-b4d33d86a345),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71dae390-2e') {{(pid=71428) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 23 03:48:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap71dae390-2e, bridge=br-int, if_exists=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:48:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:48:16 user nova-compute[71428]: INFO os_vif [None req-d8176489-0d45-4827-babd-28d7db9dc1d3 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6e:74:5a,bridge_name='br-int',has_traffic_filtering=True,id=71dae390-2e66-4961-8ce7-1b8fff845732,network=Network(5604c946-dd68-4517-9c39-b4d33d86a345),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap71dae390-2e') Apr 23 03:48:16 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-d8176489-0d45-4827-babd-28d7db9dc1d3 tempest-AttachVolumeTestJSON-276721084 
tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Deleting instance files /opt/stack/data/nova/instances/08d906c5-1698-4c17-8430-c98f10836398_del Apr 23 03:48:16 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-d8176489-0d45-4827-babd-28d7db9dc1d3 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Deletion of /opt/stack/data/nova/instances/08d906c5-1698-4c17-8430-c98f10836398_del complete Apr 23 03:48:17 user nova-compute[71428]: INFO nova.compute.manager [None req-d8176489-0d45-4827-babd-28d7db9dc1d3 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Took 0.87 seconds to destroy the instance on the hypervisor. Apr 23 03:48:17 user nova-compute[71428]: DEBUG oslo.service.loopingcall [None req-d8176489-0d45-4827-babd-28d7db9dc1d3 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71428) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 23 03:48:17 user nova-compute[71428]: DEBUG nova.compute.manager [-] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Deallocating network for instance {{(pid=71428) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 23 03:48:17 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: 08d906c5-1698-4c17-8430-c98f10836398] deallocate_for_instance() {{(pid=71428) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 23 03:48:17 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:17 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:17 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:17 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:17 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:18 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Updating instance_info_cache with network_info: [] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:48:18 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Took 1.42 seconds to deallocate network for instance. 
Apr 23 03:48:18 user nova-compute[71428]: DEBUG nova.compute.manager [req-31582cc6-837a-43b7-82ee-4b2ff12c48e3 req-03806919-f5e2-414d-bc47-42c18fd13d91 service nova] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Received event network-vif-deleted-71dae390-2e66-4961-8ce7-1b8fff845732 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:48:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-d8176489-0d45-4827-babd-28d7db9dc1d3 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:48:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-d8176489-0d45-4827-babd-28d7db9dc1d3 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:48:18 user nova-compute[71428]: DEBUG nova.compute.manager [req-3e315da1-825c-4447-8aae-f04361c8865e req-af9f4ffe-0da5-4b50-8a6e-6b75a20d0b4c service nova] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Received event network-vif-plugged-71dae390-2e66-4961-8ce7-1b8fff845732 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:48:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-3e315da1-825c-4447-8aae-f04361c8865e req-af9f4ffe-0da5-4b50-8a6e-6b75a20d0b4c service nova] Acquiring lock "08d906c5-1698-4c17-8430-c98f10836398-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:48:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-3e315da1-825c-4447-8aae-f04361c8865e req-af9f4ffe-0da5-4b50-8a6e-6b75a20d0b4c service nova] Lock "08d906c5-1698-4c17-8430-c98f10836398-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:48:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-3e315da1-825c-4447-8aae-f04361c8865e req-af9f4ffe-0da5-4b50-8a6e-6b75a20d0b4c service nova] Lock "08d906c5-1698-4c17-8430-c98f10836398-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:48:18 user nova-compute[71428]: DEBUG nova.compute.manager [req-3e315da1-825c-4447-8aae-f04361c8865e req-af9f4ffe-0da5-4b50-8a6e-6b75a20d0b4c service nova] [instance: 08d906c5-1698-4c17-8430-c98f10836398] No waiting events found dispatching network-vif-plugged-71dae390-2e66-4961-8ce7-1b8fff845732 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:48:18 user nova-compute[71428]: WARNING nova.compute.manager [req-3e315da1-825c-4447-8aae-f04361c8865e req-af9f4ffe-0da5-4b50-8a6e-6b75a20d0b4c service nova] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Received unexpected event network-vif-plugged-71dae390-2e66-4961-8ce7-1b8fff845732 for instance with vm_state deleted and 
task_state None. Apr 23 03:48:18 user nova-compute[71428]: DEBUG nova.compute.manager [req-3e315da1-825c-4447-8aae-f04361c8865e req-af9f4ffe-0da5-4b50-8a6e-6b75a20d0b4c service nova] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Received event network-vif-plugged-71dae390-2e66-4961-8ce7-1b8fff845732 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:48:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-3e315da1-825c-4447-8aae-f04361c8865e req-af9f4ffe-0da5-4b50-8a6e-6b75a20d0b4c service nova] Acquiring lock "08d906c5-1698-4c17-8430-c98f10836398-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:48:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-3e315da1-825c-4447-8aae-f04361c8865e req-af9f4ffe-0da5-4b50-8a6e-6b75a20d0b4c service nova] Lock "08d906c5-1698-4c17-8430-c98f10836398-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:48:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-3e315da1-825c-4447-8aae-f04361c8865e req-af9f4ffe-0da5-4b50-8a6e-6b75a20d0b4c service nova] Lock "08d906c5-1698-4c17-8430-c98f10836398-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:48:18 user nova-compute[71428]: DEBUG nova.compute.manager [req-3e315da1-825c-4447-8aae-f04361c8865e req-af9f4ffe-0da5-4b50-8a6e-6b75a20d0b4c service nova] [instance: 08d906c5-1698-4c17-8430-c98f10836398] No waiting events found dispatching network-vif-plugged-71dae390-2e66-4961-8ce7-1b8fff845732 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:48:18 user nova-compute[71428]: WARNING nova.compute.manager [req-3e315da1-825c-4447-8aae-f04361c8865e req-af9f4ffe-0da5-4b50-8a6e-6b75a20d0b4c service nova] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Received unexpected event network-vif-plugged-71dae390-2e66-4961-8ce7-1b8fff845732 for instance with vm_state deleted and task_state None. 
Apr 23 03:48:18 user nova-compute[71428]: DEBUG nova.compute.manager [req-3e315da1-825c-4447-8aae-f04361c8865e req-af9f4ffe-0da5-4b50-8a6e-6b75a20d0b4c service nova] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Received event network-vif-plugged-71dae390-2e66-4961-8ce7-1b8fff845732 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:48:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-3e315da1-825c-4447-8aae-f04361c8865e req-af9f4ffe-0da5-4b50-8a6e-6b75a20d0b4c service nova] Acquiring lock "08d906c5-1698-4c17-8430-c98f10836398-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:48:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-3e315da1-825c-4447-8aae-f04361c8865e req-af9f4ffe-0da5-4b50-8a6e-6b75a20d0b4c service nova] Lock "08d906c5-1698-4c17-8430-c98f10836398-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:48:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-3e315da1-825c-4447-8aae-f04361c8865e req-af9f4ffe-0da5-4b50-8a6e-6b75a20d0b4c service nova] Lock "08d906c5-1698-4c17-8430-c98f10836398-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:48:18 user nova-compute[71428]: DEBUG nova.compute.manager [req-3e315da1-825c-4447-8aae-f04361c8865e req-af9f4ffe-0da5-4b50-8a6e-6b75a20d0b4c service nova] [instance: 08d906c5-1698-4c17-8430-c98f10836398] No waiting events found dispatching network-vif-plugged-71dae390-2e66-4961-8ce7-1b8fff845732 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:48:18 user nova-compute[71428]: WARNING nova.compute.manager [req-3e315da1-825c-4447-8aae-f04361c8865e req-af9f4ffe-0da5-4b50-8a6e-6b75a20d0b4c service nova] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Received unexpected event network-vif-plugged-71dae390-2e66-4961-8ce7-1b8fff845732 for instance with vm_state deleted and task_state None. 
Apr 23 03:48:18 user nova-compute[71428]: DEBUG nova.compute.manager [req-3e315da1-825c-4447-8aae-f04361c8865e req-af9f4ffe-0da5-4b50-8a6e-6b75a20d0b4c service nova] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Received event network-vif-unplugged-71dae390-2e66-4961-8ce7-1b8fff845732 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:48:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-3e315da1-825c-4447-8aae-f04361c8865e req-af9f4ffe-0da5-4b50-8a6e-6b75a20d0b4c service nova] Acquiring lock "08d906c5-1698-4c17-8430-c98f10836398-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:48:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-3e315da1-825c-4447-8aae-f04361c8865e req-af9f4ffe-0da5-4b50-8a6e-6b75a20d0b4c service nova] Lock "08d906c5-1698-4c17-8430-c98f10836398-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:48:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-3e315da1-825c-4447-8aae-f04361c8865e req-af9f4ffe-0da5-4b50-8a6e-6b75a20d0b4c service nova] Lock "08d906c5-1698-4c17-8430-c98f10836398-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:48:18 user nova-compute[71428]: DEBUG nova.compute.manager [req-3e315da1-825c-4447-8aae-f04361c8865e req-af9f4ffe-0da5-4b50-8a6e-6b75a20d0b4c service nova] [instance: 08d906c5-1698-4c17-8430-c98f10836398] No waiting events found dispatching network-vif-unplugged-71dae390-2e66-4961-8ce7-1b8fff845732 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:48:18 user nova-compute[71428]: WARNING nova.compute.manager [req-3e315da1-825c-4447-8aae-f04361c8865e req-af9f4ffe-0da5-4b50-8a6e-6b75a20d0b4c service nova] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Received unexpected event network-vif-unplugged-71dae390-2e66-4961-8ce7-1b8fff845732 for instance with vm_state deleted and task_state None. 
Apr 23 03:48:18 user nova-compute[71428]: DEBUG nova.compute.manager [req-3e315da1-825c-4447-8aae-f04361c8865e req-af9f4ffe-0da5-4b50-8a6e-6b75a20d0b4c service nova] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Received event network-vif-plugged-71dae390-2e66-4961-8ce7-1b8fff845732 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:48:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-3e315da1-825c-4447-8aae-f04361c8865e req-af9f4ffe-0da5-4b50-8a6e-6b75a20d0b4c service nova] Acquiring lock "08d906c5-1698-4c17-8430-c98f10836398-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:48:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-3e315da1-825c-4447-8aae-f04361c8865e req-af9f4ffe-0da5-4b50-8a6e-6b75a20d0b4c service nova] Lock "08d906c5-1698-4c17-8430-c98f10836398-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:48:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-3e315da1-825c-4447-8aae-f04361c8865e req-af9f4ffe-0da5-4b50-8a6e-6b75a20d0b4c service nova] Lock "08d906c5-1698-4c17-8430-c98f10836398-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:48:18 user nova-compute[71428]: DEBUG nova.compute.manager [req-3e315da1-825c-4447-8aae-f04361c8865e req-af9f4ffe-0da5-4b50-8a6e-6b75a20d0b4c service nova] [instance: 08d906c5-1698-4c17-8430-c98f10836398] No waiting events found dispatching network-vif-plugged-71dae390-2e66-4961-8ce7-1b8fff845732 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:48:18 user nova-compute[71428]: WARNING nova.compute.manager [req-3e315da1-825c-4447-8aae-f04361c8865e req-af9f4ffe-0da5-4b50-8a6e-6b75a20d0b4c service nova] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Received unexpected event network-vif-plugged-71dae390-2e66-4961-8ce7-1b8fff845732 for instance with vm_state deleted and task_state None. 
Apr 23 03:48:18 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:18 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-d8176489-0d45-4827-babd-28d7db9dc1d3 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:48:18 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-d8176489-0d45-4827-babd-28d7db9dc1d3 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:48:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-d8176489-0d45-4827-babd-28d7db9dc1d3 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.247s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:48:18 user nova-compute[71428]: INFO nova.scheduler.client.report [None req-d8176489-0d45-4827-babd-28d7db9dc1d3 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Deleted allocations for instance 08d906c5-1698-4c17-8430-c98f10836398 Apr 23 03:48:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-d8176489-0d45-4827-babd-28d7db9dc1d3 tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Lock "08d906c5-1698-4c17-8430-c98f10836398" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.717s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:48:21 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:22 user nova-compute[71428]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:48:22 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] VM Stopped (Lifecycle Event) Apr 23 03:48:22 user nova-compute[71428]: DEBUG nova.compute.manager [None req-0aa87d40-af37-4311-af51-55b0ad47a4c0 None None] [instance: 56d8da41-3e04-465b-a1de-73d9e994682d] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:48:23 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:26 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
[POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:28 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:31 user nova-compute[71428]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:48:31 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: 08d906c5-1698-4c17-8430-c98f10836398] VM Stopped (Lifecycle Event) Apr 23 03:48:31 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e506b506-4037-440a-8e67-23384ea0ddbb None None] [instance: 08d906c5-1698-4c17-8430-c98f10836398] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:48:31 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:36 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:48:36 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:48:36 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71428) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 23 03:48:36 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 03:48:36 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 03:48:36 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:38 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:41 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:42 user nova-compute[71428]: DEBUG nova.compute.manager [req-ccf8414c-31c5-4810-9dab-c597417a0c43 req-7ca24881-e088-4b41-8931-eea2e4216e48 service nova] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Received event network-changed-3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:48:42 user nova-compute[71428]: DEBUG nova.compute.manager [req-ccf8414c-31c5-4810-9dab-c597417a0c43 req-7ca24881-e088-4b41-8931-eea2e4216e48 service nova] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Refreshing instance network info cache due to event network-changed-3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a. 
{{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 23 03:48:42 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-ccf8414c-31c5-4810-9dab-c597417a0c43 req-7ca24881-e088-4b41-8931-eea2e4216e48 service nova] Acquiring lock "refresh_cache-ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:48:42 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-ccf8414c-31c5-4810-9dab-c597417a0c43 req-7ca24881-e088-4b41-8931-eea2e4216e48 service nova] Acquired lock "refresh_cache-ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:48:42 user nova-compute[71428]: DEBUG nova.network.neutron [req-ccf8414c-31c5-4810-9dab-c597417a0c43 req-7ca24881-e088-4b41-8931-eea2e4216e48 service nova] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Refreshing network info cache for port 3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 23 03:48:43 user nova-compute[71428]: DEBUG nova.network.neutron [req-ccf8414c-31c5-4810-9dab-c597417a0c43 req-7ca24881-e088-4b41-8931-eea2e4216e48 service nova] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Updated VIF entry in instance network info cache for port 3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a. {{(pid=71428) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 23 03:48:43 user nova-compute[71428]: DEBUG nova.network.neutron [req-ccf8414c-31c5-4810-9dab-c597417a0c43 req-7ca24881-e088-4b41-8931-eea2e4216e48 service nova] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Updating instance_info_cache with network_info: [{"id": "3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a", "address": "fa:16:3e:68:41:17", "network": {"id": "3986d1d8-252f-42bf-99c9-3d8b53a4b69d", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1773818503-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.54", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "0f253a2e878a45d99dd3cbda86c0c6ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ba908e8-68", "ovs_interfaceid": "3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:48:43 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-ccf8414c-31c5-4810-9dab-c597417a0c43 req-7ca24881-e088-4b41-8931-eea2e4216e48 service nova] Releasing lock "refresh_cache-ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:48:43 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:44 user nova-compute[71428]: DEBUG 
oslo_concurrency.lockutils [None req-9cdefdf6-fd7f-48e1-a49f-a74ffd154a76 tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Acquiring lock "ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:48:44 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-9cdefdf6-fd7f-48e1-a49f-a74ffd154a76 tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Lock "ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:48:44 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-9cdefdf6-fd7f-48e1-a49f-a74ffd154a76 tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Acquiring lock "ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:48:44 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-9cdefdf6-fd7f-48e1-a49f-a74ffd154a76 tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Lock "ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:48:44 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-9cdefdf6-fd7f-48e1-a49f-a74ffd154a76 tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Lock "ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:48:44 user nova-compute[71428]: INFO nova.compute.manager [None req-9cdefdf6-fd7f-48e1-a49f-a74ffd154a76 tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Terminating instance Apr 23 03:48:44 user nova-compute[71428]: DEBUG nova.compute.manager [None req-9cdefdf6-fd7f-48e1-a49f-a74ffd154a76 tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Start destroying the instance on the hypervisor. 
{{(pid=71428) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 23 03:48:44 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:44 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:44 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:44 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:44 user nova-compute[71428]: DEBUG nova.compute.manager [req-96002049-9b9e-4d05-8ddf-1f2d66a869e2 req-5e008153-2053-4f2b-86b6-b05fecd0382e service nova] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Received event network-vif-unplugged-3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:48:44 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-96002049-9b9e-4d05-8ddf-1f2d66a869e2 req-5e008153-2053-4f2b-86b6-b05fecd0382e service nova] Acquiring lock "ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:48:44 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-96002049-9b9e-4d05-8ddf-1f2d66a869e2 req-5e008153-2053-4f2b-86b6-b05fecd0382e service nova] Lock "ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:48:44 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-96002049-9b9e-4d05-8ddf-1f2d66a869e2 req-5e008153-2053-4f2b-86b6-b05fecd0382e service nova] Lock "ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:48:44 user nova-compute[71428]: DEBUG nova.compute.manager [req-96002049-9b9e-4d05-8ddf-1f2d66a869e2 req-5e008153-2053-4f2b-86b6-b05fecd0382e service nova] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] No waiting events found dispatching network-vif-unplugged-3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:48:44 user nova-compute[71428]: DEBUG nova.compute.manager [req-96002049-9b9e-4d05-8ddf-1f2d66a869e2 req-5e008153-2053-4f2b-86b6-b05fecd0382e service nova] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Received event network-vif-unplugged-3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a for instance with task_state deleting. {{(pid=71428) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 23 03:48:45 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Instance destroyed successfully. 
Apr 23 03:48:45 user nova-compute[71428]: DEBUG nova.objects.instance [None req-9cdefdf6-fd7f-48e1-a49f-a74ffd154a76 tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Lazy-loading 'resources' on Instance uuid ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:48:45 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-9cdefdf6-fd7f-48e1-a49f-a74ffd154a76 tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2023-04-23T03:46:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-1223917650',display_name='tempest-AttachSCSIVolumeTestJSON-server-1223917650',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachscsivolumetestjson-server-1223917650',id=4,image_ref='5cb75aca-167f-494a-9dae-c96fc146c1ab',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGNSlNQDdoc5OwX5MqEuWXOwleRbjHOg+KZZGXRrtYPfsqJpkBX2tsZzDLx+qRW0+E03swcsJdX8Ei+ZrkxMI0G08tG/YU2etmuLBhObrxLGe/QP38Mzd/FO6rnQsefmhQ==',key_name='tempest-keypair-282378258',keypairs=,launch_index=0,launched_at=2023-04-23T03:47:04Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='0f253a2e878a45d99dd3cbda86c0c6ff',ramdisk_id='',reservation_id='r-sr4p0ec0',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5cb75aca-167f-494a-9dae-c96fc146c1ab',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_scsi_model='virtio-scsi',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachSCSIVolumeTestJSON-1319344340',owner_user_name='tempest-AttachSCSIVolumeTestJSON-1319344340-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-23T03:47:05Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fc51570d0e7c4617ac6b8fa428b9660f',uuid=ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a", "address": "fa:16:3e:68:41:17", "network": {"id": "3986d1d8-252f-42bf-99c9-3d8b53a4b69d", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1773818503-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.54", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "0f253a2e878a45d99dd3cbda86c0c6ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ba908e8-68", "ovs_interfaceid": "3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 23 03:48:45 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-9cdefdf6-fd7f-48e1-a49f-a74ffd154a76 tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Converting VIF {"id": "3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a", "address": "fa:16:3e:68:41:17", "network": {"id": "3986d1d8-252f-42bf-99c9-3d8b53a4b69d", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1773818503-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.54", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "0f253a2e878a45d99dd3cbda86c0c6ff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ba908e8-68", "ovs_interfaceid": "3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:48:45 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-9cdefdf6-fd7f-48e1-a49f-a74ffd154a76 tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:68:41:17,bridge_name='br-int',has_traffic_filtering=True,id=3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a,network=Network(3986d1d8-252f-42bf-99c9-3d8b53a4b69d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ba908e8-68') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:48:45 user nova-compute[71428]: DEBUG os_vif [None req-9cdefdf6-fd7f-48e1-a49f-a74ffd154a76 tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:68:41:17,bridge_name='br-int',has_traffic_filtering=True,id=3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a,network=Network(3986d1d8-252f-42bf-99c9-3d8b53a4b69d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ba908e8-68') {{(pid=71428) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 23 03:48:45 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:45 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): 
DelPortCommand(_result=None, port=tap3ba908e8-68, bridge=br-int, if_exists=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:48:45 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:45 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:48:45 user nova-compute[71428]: INFO os_vif [None req-9cdefdf6-fd7f-48e1-a49f-a74ffd154a76 tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:68:41:17,bridge_name='br-int',has_traffic_filtering=True,id=3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a,network=Network(3986d1d8-252f-42bf-99c9-3d8b53a4b69d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ba908e8-68') Apr 23 03:48:45 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-9cdefdf6-fd7f-48e1-a49f-a74ffd154a76 tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Deleting instance files /opt/stack/data/nova/instances/ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72_del Apr 23 03:48:45 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-9cdefdf6-fd7f-48e1-a49f-a74ffd154a76 tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Deletion of /opt/stack/data/nova/instances/ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72_del complete Apr 23 03:48:45 user nova-compute[71428]: INFO nova.compute.manager [None req-9cdefdf6-fd7f-48e1-a49f-a74ffd154a76 tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Took 0.88 seconds to destroy the instance on the hypervisor. Apr 23 03:48:45 user nova-compute[71428]: DEBUG oslo.service.loopingcall [None req-9cdefdf6-fd7f-48e1-a49f-a74ffd154a76 tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71428) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 23 03:48:45 user nova-compute[71428]: DEBUG nova.compute.manager [-] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Deallocating network for instance {{(pid=71428) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 23 03:48:45 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] deallocate_for_instance() {{(pid=71428) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 23 03:48:46 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Updating instance_info_cache with network_info: [] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:48:46 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Took 0.96 seconds to deallocate network for instance. 
Apr 23 03:48:46 user nova-compute[71428]: DEBUG nova.compute.manager [req-034bf656-00db-4bb3-bc15-7ae4e9b34232 req-bf087ff5-3d25-4719-8fde-598f18dbeecf service nova] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Received event network-vif-deleted-3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:48:46 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-9cdefdf6-fd7f-48e1-a49f-a74ffd154a76 tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:48:46 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-9cdefdf6-fd7f-48e1-a49f-a74ffd154a76 tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:48:46 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-9cdefdf6-fd7f-48e1-a49f-a74ffd154a76 tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:48:46 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-9cdefdf6-fd7f-48e1-a49f-a74ffd154a76 tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:48:46 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-9cdefdf6-fd7f-48e1-a49f-a74ffd154a76 tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.292s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:48:46 user nova-compute[71428]: INFO nova.scheduler.client.report [None req-9cdefdf6-fd7f-48e1-a49f-a74ffd154a76 tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Deleted allocations for instance ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72 Apr 23 03:48:46 user nova-compute[71428]: DEBUG nova.compute.manager [req-2ab277be-c2f0-4ece-b3ae-37499eb6eae4 req-92685e2d-04b1-4fe0-ab50-f4ace3155b4d service nova] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Received event network-vif-plugged-3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:48:46 user nova-compute[71428]: 
DEBUG oslo_concurrency.lockutils [req-2ab277be-c2f0-4ece-b3ae-37499eb6eae4 req-92685e2d-04b1-4fe0-ab50-f4ace3155b4d service nova] Acquiring lock "ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:48:46 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-2ab277be-c2f0-4ece-b3ae-37499eb6eae4 req-92685e2d-04b1-4fe0-ab50-f4ace3155b4d service nova] Lock "ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:48:46 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-2ab277be-c2f0-4ece-b3ae-37499eb6eae4 req-92685e2d-04b1-4fe0-ab50-f4ace3155b4d service nova] Lock "ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:48:46 user nova-compute[71428]: DEBUG nova.compute.manager [req-2ab277be-c2f0-4ece-b3ae-37499eb6eae4 req-92685e2d-04b1-4fe0-ab50-f4ace3155b4d service nova] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] No waiting events found dispatching network-vif-plugged-3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:48:46 user nova-compute[71428]: WARNING nova.compute.manager [req-2ab277be-c2f0-4ece-b3ae-37499eb6eae4 req-92685e2d-04b1-4fe0-ab50-f4ace3155b4d service nova] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Received unexpected event network-vif-plugged-3ba908e8-68cc-4bf7-be30-d5f6bbd0ff1a for instance with vm_state deleted and task_state None. 
Apr 23 03:48:47 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-9cdefdf6-fd7f-48e1-a49f-a74ffd154a76 tempest-AttachSCSIVolumeTestJSON-1319344340 tempest-AttachSCSIVolumeTestJSON-1319344340-project-member] Lock "ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.379s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:48:48 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:50 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:55 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:48:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:48:59 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:00 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:00 user nova-compute[71428]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:49:00 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] VM Stopped (Lifecycle Event) Apr 23 03:49:00 user nova-compute[71428]: DEBUG nova.compute.manager [None req-d6584bef-2683-49f7-b46b-c4475b3d24fd None None] [instance: ba71c9bf-4d96-4fdc-8a55-cbdc75c29e72] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:49:00 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:04 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:05 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:06 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:49:06 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:49:06 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:08 user 
nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Acquiring lock "c8480a7c-b5de-4f66-a4f0-08fc679b0dfd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:49:08 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "c8480a7c-b5de-4f66-a4f0-08fc679b0dfd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:49:08 user nova-compute[71428]: DEBUG nova.compute.manager [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Starting instance... {{(pid=71428) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 23 03:49:08 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:49:08 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:49:08 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71428) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 23 03:49:08 user nova-compute[71428]: INFO nova.compute.claims [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Claim successful on node user Apr 23 03:49:08 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:49:08 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:49:08 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.350s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:49:08 user nova-compute[71428]: DEBUG nova.compute.manager [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Start building networks asynchronously for instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 23 03:49:08 user nova-compute[71428]: DEBUG nova.compute.manager [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Allocating IP information in the background. 
{{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 23 03:49:08 user nova-compute[71428]: DEBUG nova.network.neutron [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] allocate_for_instance() {{(pid=71428) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 23 03:49:08 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 23 03:49:08 user nova-compute[71428]: DEBUG nova.compute.manager [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Start building block device mappings for instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 23 03:49:08 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:49:08 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:08 user nova-compute[71428]: DEBUG nova.compute.manager [req-d9a5d101-d766-425f-8c94-00786ceab719 req-4a39f113-fba7-4965-94e1-ea947038072f service nova] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Received event network-changed-af57258e-69b3-405b-a9f6-8d54c1810960 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:49:08 user nova-compute[71428]: DEBUG nova.compute.manager [req-d9a5d101-d766-425f-8c94-00786ceab719 req-4a39f113-fba7-4965-94e1-ea947038072f service nova] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Refreshing instance network info cache due to event network-changed-af57258e-69b3-405b-a9f6-8d54c1810960. 
{{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 23 03:49:08 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-d9a5d101-d766-425f-8c94-00786ceab719 req-4a39f113-fba7-4965-94e1-ea947038072f service nova] Acquiring lock "refresh_cache-3ec36a95-88ce-4ed1-9726-7e6d98674dec" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:49:08 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-d9a5d101-d766-425f-8c94-00786ceab719 req-4a39f113-fba7-4965-94e1-ea947038072f service nova] Acquired lock "refresh_cache-3ec36a95-88ce-4ed1-9726-7e6d98674dec" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:49:08 user nova-compute[71428]: DEBUG nova.network.neutron [req-d9a5d101-d766-425f-8c94-00786ceab719 req-4a39f113-fba7-4965-94e1-ea947038072f service nova] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Refreshing network info cache for port af57258e-69b3-405b-a9f6-8d54c1810960 {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 23 03:49:08 user nova-compute[71428]: DEBUG nova.policy [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '99495726467944c38620831fc93e2856', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a79e09b1c4ae4cc5ab11c3e56ee4f0d9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71428) authorize /opt/stack/nova/nova/policy.py:203}} Apr 23 03:49:08 user nova-compute[71428]: DEBUG nova.compute.manager [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Start spawning the instance on the hypervisor. 
{{(pid=71428) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 23 03:49:08 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Creating instance directory {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 23 03:49:08 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Creating image(s) Apr 23 03:49:08 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Acquiring lock "/opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:49:08 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "/opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:49:08 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "/opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:49:08 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:49:08 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.146s {{(pid=71428) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:49:08 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Acquiring lock "935dc3474dddbde110531a31510941caddc0ae83" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:49:08 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:49:08 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:49:08 user nova-compute[71428]: DEBUG nova.compute.manager [req-2d27a2f0-8f1a-4f02-a9f0-dc32289fbd7d req-5010b466-dbdc-4e17-81c3-4c52a62a96e8 service nova] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Received event network-changed-97d945a1-b86e-4a6d-af52-23084b8eb175 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:49:08 user nova-compute[71428]: DEBUG nova.compute.manager [req-2d27a2f0-8f1a-4f02-a9f0-dc32289fbd7d req-5010b466-dbdc-4e17-81c3-4c52a62a96e8 service nova] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Refreshing instance network info cache due to event network-changed-97d945a1-b86e-4a6d-af52-23084b8eb175. 
{{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 23 03:49:08 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-2d27a2f0-8f1a-4f02-a9f0-dc32289fbd7d req-5010b466-dbdc-4e17-81c3-4c52a62a96e8 service nova] Acquiring lock "refresh_cache-954f41b2-5c67-41a4-b38b-ebf3ad60cac7" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:49:08 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-2d27a2f0-8f1a-4f02-a9f0-dc32289fbd7d req-5010b466-dbdc-4e17-81c3-4c52a62a96e8 service nova] Acquired lock "refresh_cache-954f41b2-5c67-41a4-b38b-ebf3ad60cac7" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:49:08 user nova-compute[71428]: DEBUG nova.network.neutron [req-2d27a2f0-8f1a-4f02-a9f0-dc32289fbd7d req-5010b466-dbdc-4e17-81c3-4c52a62a96e8 service nova] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Refreshing network info cache for port 97d945a1-b86e-4a6d-af52-23084b8eb175 {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 23 03:49:09 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.146s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:49:09 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk 1073741824 {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:49:09 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk 1073741824" returned: 0 in 0.053s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:49:09 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.204s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:49:09 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None 
req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:49:09 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.135s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:49:09 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Checking if we can resize image /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk. size=1073741824 {{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 23 03:49:09 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:49:09 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:49:09 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Cannot resize image /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk to a smaller size. 
{{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 23 03:49:09 user nova-compute[71428]: DEBUG nova.objects.instance [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lazy-loading 'migration_context' on Instance uuid c8480a7c-b5de-4f66-a4f0-08fc679b0dfd {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:49:09 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Created local disks {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 23 03:49:09 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Ensure instance console log exists: /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/console.log {{(pid=71428) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 23 03:49:09 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:49:09 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:49:09 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:49:09 user nova-compute[71428]: DEBUG nova.network.neutron [req-2d27a2f0-8f1a-4f02-a9f0-dc32289fbd7d req-5010b466-dbdc-4e17-81c3-4c52a62a96e8 service nova] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Updated VIF entry in instance network info cache for port 97d945a1-b86e-4a6d-af52-23084b8eb175. 
{{(pid=71428) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 23 03:49:09 user nova-compute[71428]: DEBUG nova.network.neutron [req-2d27a2f0-8f1a-4f02-a9f0-dc32289fbd7d req-5010b466-dbdc-4e17-81c3-4c52a62a96e8 service nova] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Updating instance_info_cache with network_info: [{"id": "97d945a1-b86e-4a6d-af52-23084b8eb175", "address": "fa:16:3e:ee:63:58", "network": {"id": "bc639aed-946f-458c-b08d-ce3037f5507c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-558421264-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6f44fe9ba07e4e2f925d1a8952356e04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap97d945a1-b8", "ovs_interfaceid": "97d945a1-b86e-4a6d-af52-23084b8eb175", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:49:09 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-2d27a2f0-8f1a-4f02-a9f0-dc32289fbd7d req-5010b466-dbdc-4e17-81c3-4c52a62a96e8 service nova] Releasing lock "refresh_cache-954f41b2-5c67-41a4-b38b-ebf3ad60cac7" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:49:09 user nova-compute[71428]: DEBUG nova.network.neutron [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Successfully created port: e7548d21-c865-4648-a8a6-dba748a01e14 {{(pid=71428) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 23 03:49:10 user nova-compute[71428]: DEBUG nova.network.neutron [req-d9a5d101-d766-425f-8c94-00786ceab719 req-4a39f113-fba7-4965-94e1-ea947038072f service nova] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Updated VIF entry in instance network info cache for port af57258e-69b3-405b-a9f6-8d54c1810960. 
{{(pid=71428) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 23 03:49:10 user nova-compute[71428]: DEBUG nova.network.neutron [req-d9a5d101-d766-425f-8c94-00786ceab719 req-4a39f113-fba7-4965-94e1-ea947038072f service nova] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Updating instance_info_cache with network_info: [{"id": "af57258e-69b3-405b-a9f6-8d54c1810960", "address": "fa:16:3e:b4:62:dc", "network": {"id": "6adb189c-9b46-4da4-a34b-31edf8685498", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-759967402-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.137", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "24fff486a500421397ecb935828582cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf57258e-69", "ovs_interfaceid": "af57258e-69b3-405b-a9f6-8d54c1810960", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:49:10 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-d9a5d101-d766-425f-8c94-00786ceab719 req-4a39f113-fba7-4965-94e1-ea947038072f service nova] Releasing lock "refresh_cache-3ec36a95-88ce-4ed1-9726-7e6d98674dec" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:49:10 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-aeac8a8c-69e5-4ef2-b812-1ef35e6691fe tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Acquiring lock "3ec36a95-88ce-4ed1-9726-7e6d98674dec" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:49:10 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-aeac8a8c-69e5-4ef2-b812-1ef35e6691fe tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "3ec36a95-88ce-4ed1-9726-7e6d98674dec" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:49:10 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-aeac8a8c-69e5-4ef2-b812-1ef35e6691fe tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Acquiring lock "3ec36a95-88ce-4ed1-9726-7e6d98674dec-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:49:10 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-aeac8a8c-69e5-4ef2-b812-1ef35e6691fe tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "3ec36a95-88ce-4ed1-9726-7e6d98674dec-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:49:10 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-aeac8a8c-69e5-4ef2-b812-1ef35e6691fe tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "3ec36a95-88ce-4ed1-9726-7e6d98674dec-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:49:10 user nova-compute[71428]: INFO nova.compute.manager [None req-aeac8a8c-69e5-4ef2-b812-1ef35e6691fe tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Terminating instance Apr 23 03:49:10 user nova-compute[71428]: DEBUG nova.compute.manager [None req-aeac8a8c-69e5-4ef2-b812-1ef35e6691fe tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Start destroying the instance on the hypervisor. {{(pid=71428) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 23 03:49:10 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:10 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:10 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:10 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:49:10 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:49:10 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71428) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 23 03:49:10 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:49:10 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:49:10 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:49:10 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:49:10 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Auditing locally available compute resources for user (node: user) {{(pid=71428) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 23 03:49:10 user nova-compute[71428]: DEBUG nova.compute.manager [req-6ebb98de-8a20-413b-bad4-e4b3e74b2e16 req-8fb37789-5018-4c0f-bc7a-eaa7a90013e1 service nova] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Received event network-vif-unplugged-af57258e-69b3-405b-a9f6-8d54c1810960 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:49:10 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-6ebb98de-8a20-413b-bad4-e4b3e74b2e16 req-8fb37789-5018-4c0f-bc7a-eaa7a90013e1 service nova] Acquiring lock "3ec36a95-88ce-4ed1-9726-7e6d98674dec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:49:10 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-6ebb98de-8a20-413b-bad4-e4b3e74b2e16 req-8fb37789-5018-4c0f-bc7a-eaa7a90013e1 service nova] Lock "3ec36a95-88ce-4ed1-9726-7e6d98674dec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:49:10 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-6ebb98de-8a20-413b-bad4-e4b3e74b2e16 req-8fb37789-5018-4c0f-bc7a-eaa7a90013e1 service nova] Lock "3ec36a95-88ce-4ed1-9726-7e6d98674dec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:49:10 user nova-compute[71428]: DEBUG nova.compute.manager [req-6ebb98de-8a20-413b-bad4-e4b3e74b2e16 req-8fb37789-5018-4c0f-bc7a-eaa7a90013e1 service nova] [instance: 
3ec36a95-88ce-4ed1-9726-7e6d98674dec] No waiting events found dispatching network-vif-unplugged-af57258e-69b3-405b-a9f6-8d54c1810960 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:49:10 user nova-compute[71428]: DEBUG nova.compute.manager [req-6ebb98de-8a20-413b-bad4-e4b3e74b2e16 req-8fb37789-5018-4c0f-bc7a-eaa7a90013e1 service nova] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Received event network-vif-unplugged-af57258e-69b3-405b-a9f6-8d54c1810960 for instance with task_state deleting. {{(pid=71428) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 23 03:49:10 user nova-compute[71428]: DEBUG nova.network.neutron [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Successfully updated port: e7548d21-c865-4648-a8a6-dba748a01e14 {{(pid=71428) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 23 03:49:10 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Acquiring lock "refresh_cache-c8480a7c-b5de-4f66-a4f0-08fc679b0dfd" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:49:10 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Acquired lock "refresh_cache-c8480a7c-b5de-4f66-a4f0-08fc679b0dfd" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:49:10 user nova-compute[71428]: DEBUG nova.network.neutron [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Building network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 23 03:49:10 user nova-compute[71428]: DEBUG nova.network.neutron [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Instance cache missing network info. 
{{(pid=71428) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 23 03:49:10 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Acquiring lock "6ee9a44e-e6af-4eec-975c-3991146ce71b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:49:10 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Lock "6ee9a44e-e6af-4eec-975c-3991146ce71b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:49:10 user nova-compute[71428]: DEBUG nova.compute.manager [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Starting instance... {{(pid=71428) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 23 03:49:10 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Instance destroyed successfully. Apr 23 03:49:10 user nova-compute[71428]: DEBUG nova.objects.instance [None req-aeac8a8c-69e5-4ef2-b812-1ef35e6691fe tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lazy-loading 'resources' on Instance uuid 3ec36a95-88ce-4ed1-9726-7e6d98674dec {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:49:10 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-aeac8a8c-69e5-4ef2-b812-1ef35e6691fe tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:47:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-8654037',display_name='tempest-AttachVolumeNegativeTest-server-8654037',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-8654037',id=6,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBChmwKzve84HsDc9e1enTU2dyNO3uMaTpP7qkGUnUryRoyEDEy7qYcgSp8YUuV2PVICAmznt0zJV+FBBbxWJ37uTbw3NkoqcDxcyXKpOSwoZ98BfcM/UT5YtXU1qwYpFAw==',key_name='tempest-keypair-2054139165',keypairs=,launch_index=0,launched_at=2023-04-23T03:47:26Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='24fff486a500421397ecb935828582cd',ramdisk_id='',reservation_id='r-i401xtxu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeNegativeTest-636753786',owner_user_name='tempest-AttachVolumeNegativeTest-636753786-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-23T03:47:27Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0a2c459cad014b07b2613e5e261d88aa',uuid=3ec36a95-88ce-4ed1-9726-7e6d98674dec,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "af57258e-69b3-405b-a9f6-8d54c1810960", "address": "fa:16:3e:b4:62:dc", "network": {"id": "6adb189c-9b46-4da4-a34b-31edf8685498", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-759967402-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.137", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "24fff486a500421397ecb935828582cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf57258e-69", "ovs_interfaceid": "af57258e-69b3-405b-a9f6-8d54c1810960", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 23 03:49:10 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-aeac8a8c-69e5-4ef2-b812-1ef35e6691fe tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Converting VIF {"id": "af57258e-69b3-405b-a9f6-8d54c1810960", "address": "fa:16:3e:b4:62:dc", "network": {"id": "6adb189c-9b46-4da4-a34b-31edf8685498", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-759967402-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.137", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "24fff486a500421397ecb935828582cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf57258e-69", "ovs_interfaceid": "af57258e-69b3-405b-a9f6-8d54c1810960", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:49:10 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-aeac8a8c-69e5-4ef2-b812-1ef35e6691fe tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b4:62:dc,bridge_name='br-int',has_traffic_filtering=True,id=af57258e-69b3-405b-a9f6-8d54c1810960,network=Network(6adb189c-9b46-4da4-a34b-31edf8685498),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf57258e-69') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:49:10 user nova-compute[71428]: DEBUG os_vif [None req-aeac8a8c-69e5-4ef2-b812-1ef35e6691fe tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b4:62:dc,bridge_name='br-int',has_traffic_filtering=True,id=af57258e-69b3-405b-a9f6-8d54c1810960,network=Network(6adb189c-9b46-4da4-a34b-31edf8685498),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf57258e-69') {{(pid=71428) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 23 03:49:10 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:10 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaf57258e-69, bridge=br-int, if_exists=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:49:10 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:10 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:49:10 user nova-compute[71428]: INFO os_vif [None req-aeac8a8c-69e5-4ef2-b812-1ef35e6691fe tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b4:62:dc,bridge_name='br-int',has_traffic_filtering=True,id=af57258e-69b3-405b-a9f6-8d54c1810960,network=Network(6adb189c-9b46-4da4-a34b-31edf8685498),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf57258e-69') Apr 23 03:49:10 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-aeac8a8c-69e5-4ef2-b812-1ef35e6691fe tempest-AttachVolumeNegativeTest-636753786 
tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Deleting instance files /opt/stack/data/nova/instances/3ec36a95-88ce-4ed1-9726-7e6d98674dec_del Apr 23 03:49:10 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-aeac8a8c-69e5-4ef2-b812-1ef35e6691fe tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Deletion of /opt/stack/data/nova/instances/3ec36a95-88ce-4ed1-9726-7e6d98674dec_del complete Apr 23 03:49:11 user nova-compute[71428]: DEBUG nova.compute.manager [req-81a5e1e3-e308-45fe-94d9-c503ffecc66a req-e0781568-8388-4342-aab8-73a56bfa669f service nova] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Received event network-changed-e7548d21-c865-4648-a8a6-dba748a01e14 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG nova.compute.manager [req-81a5e1e3-e308-45fe-94d9-c503ffecc66a req-e0781568-8388-4342-aab8-73a56bfa669f service nova] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Refreshing instance network info cache due to event network-changed-e7548d21-c865-4648-a8a6-dba748a01e14. {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-81a5e1e3-e308-45fe-94d9-c503ffecc66a req-e0781568-8388-4342-aab8-73a56bfa669f service nova] Acquiring lock "refresh_cache-c8480a7c-b5de-4f66-a4f0-08fc679b0dfd" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71428) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 23 03:49:11 user nova-compute[71428]: INFO nova.compute.claims [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Claim successful on node user Apr 23 03:49:11 user nova-compute[71428]: INFO nova.compute.manager [None req-aeac8a8c-69e5-4ef2-b812-1ef35e6691fe tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Took 0.69 seconds to destroy the instance on the hypervisor. 
Apr 23 03:49:11 user nova-compute[71428]: DEBUG oslo.service.loopingcall [None req-aeac8a8c-69e5-4ef2-b812-1ef35e6691fe tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71428) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG nova.compute.manager [-] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Deallocating network for instance {{(pid=71428) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] deallocate_for_instance() {{(pid=71428) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 23 03:49:11 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Periodic task is updating the host stats, it is trying to get disk info for instance-00000006, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /opt/stack/data/nova/instances/3ec36a95-88ce-4ed1-9726-7e6d98674dec/disk: nova.exception.DiskNotFound: No disk at /opt/stack/data/nova/instances/3ec36a95-88ce-4ed1-9726-7e6d98674dec/disk Apr 23 03:49:11 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6f7cf091-3140-4746-bd1c-95c42ea82f6e/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG nova.network.neutron [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Updating instance_info_cache with network_info: [{"id": "e7548d21-c865-4648-a8a6-dba748a01e14", "address": "fa:16:3e:08:69:16", "network": {"id": "25de9e6c-4c02-4658-b076-466681d96605", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1644366708-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a79e09b1c4ae4cc5ab11c3e56ee4f0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape7548d21-c8", "ovs_interfaceid": "e7548d21-c865-4648-a8a6-dba748a01e14", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 
tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Releasing lock "refresh_cache-c8480a7c-b5de-4f66-a4f0-08fc679b0dfd" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG nova.compute.manager [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Instance network_info: |[{"id": "e7548d21-c865-4648-a8a6-dba748a01e14", "address": "fa:16:3e:08:69:16", "network": {"id": "25de9e6c-4c02-4658-b076-466681d96605", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1644366708-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a79e09b1c4ae4cc5ab11c3e56ee4f0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape7548d21-c8", "ovs_interfaceid": "e7548d21-c865-4648-a8a6-dba748a01e14", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-81a5e1e3-e308-45fe-94d9-c503ffecc66a req-e0781568-8388-4342-aab8-73a56bfa669f service nova] Acquired lock "refresh_cache-c8480a7c-b5de-4f66-a4f0-08fc679b0dfd" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG nova.network.neutron [req-81a5e1e3-e308-45fe-94d9-c503ffecc66a req-e0781568-8388-4342-aab8-73a56bfa669f service nova] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Refreshing network info cache for port e7548d21-c865-4648-a8a6-dba748a01e14 {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Start _get_guest_xml network_info=[{"id": "e7548d21-c865-4648-a8a6-dba748a01e14", "address": "fa:16:3e:08:69:16", "network": {"id": "25de9e6c-4c02-4658-b076-466681d96605", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1644366708-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a79e09b1c4ae4cc5ab11c3e56ee4f0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape7548d21-c8", "ovs_interfaceid": "e7548d21-c865-4648-a8a6-dba748a01e14", "qbh_params": 
null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'image_id': 'e6127373-9931-4277-9458-eceef653ea1e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 23 03:49:11 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:49:11 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 23 03:49:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71428) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-23T03:43:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=), allow threads: True {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Flavor limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Image limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Flavor pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Image pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 
tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Got 1 possible topologies {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:49:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-2136327616',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-2136327616',id=9,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a79e09b1c4ae4cc5ab11c3e56ee4f0d9',ramdisk_id='',reservation_id='r-uc8k6aib',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-653032906',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:49:09Z,user_data=None,user_id='99495726467944c38620831fc93e2856',uuid=c8480a7c-b5de-4f66-a4f0-08fc679b0dfd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e7548d21-c865-4648-a8a6-dba748a01e14", "address": "fa:16:3e:08:69:16", "network": {"id": "25de9e6c-4c02-4658-b076-466681d96605", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1644366708-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a79e09b1c4ae4cc5ab11c3e56ee4f0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape7548d21-c8", "ovs_interfaceid": "e7548d21-c865-4648-a8a6-dba748a01e14", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71428) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Converting VIF {"id": "e7548d21-c865-4648-a8a6-dba748a01e14", "address": 
"fa:16:3e:08:69:16", "network": {"id": "25de9e6c-4c02-4658-b076-466681d96605", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1644366708-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a79e09b1c4ae4cc5ab11c3e56ee4f0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape7548d21-c8", "ovs_interfaceid": "e7548d21-c865-4648-a8a6-dba748a01e14", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:08:69:16,bridge_name='br-int',has_traffic_filtering=True,id=e7548d21-c865-4648-a8a6-dba748a01e14,network=Network(25de9e6c-4c02-4658-b076-466681d96605),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7548d21-c8') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG nova.objects.instance [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lazy-loading 'pci_devices' on Instance uuid c8480a7c-b5de-4f66-a4f0-08fc679b0dfd {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6f7cf091-3140-4746-bd1c-95c42ea82f6e/disk --force-share --output=json" returned: 0 in 0.169s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6f7cf091-3140-4746-bd1c-95c42ea82f6e/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] End _get_guest_xml xml= Apr 23 03:49:11 user nova-compute[71428]: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd Apr 23 03:49:11 user nova-compute[71428]: instance-00000009 Apr 23 03:49:11 user nova-compute[71428]: 131072 Apr 23 03:49:11 
user nova-compute[71428]: [guest domain XML elided: the XML markup was stripped when this log was captured, leaving only text nodes spread across the surrounding journal lines. Recoverable values: instance uuid c8480a7c-b5de-4f66-a4f0-08fc679b0dfd, domain name instance-00000009, memory 131072 KiB, 1 vCPU, display name tempest-ServerBootFromVolumeStableRescueTest-server-2136327616, created 2023-04-23 03:49:11, owner tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member / tempest-ServerBootFromVolumeStableRescueTest-653032906, sysinfo OpenStack Foundation / OpenStack Nova / 0.0.0, os type hvm, CPU model Nehalem, RNG backend /dev/urandom]
Apr 23 03:49:11 user nova-compute[71428]: Apr 23 03:49:11 user nova-compute[71428]: Apr 23 03:49:11 user nova-compute[71428]: Apr 23 03:49:11 user nova-compute[71428]: Apr 23 03:49:11 user nova-compute[71428]: {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:49:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-2136327616',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-2136327616',id=9,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a79e09b1c4ae4cc5ab11c3e56ee4f0d9',ramdisk_id='',reservation_id='r-uc8k6aib',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-653032906',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:49:09Z,user_data=None,user_id='99495726467944c38620831fc93e2856',uuid=c8480a7c-b5de-4f66-a4f0-08fc679b0dfd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e7548d21-c865-4648-a8a6-dba748a01e14", "address": "fa:16:3e:08:69:16", "network": {"id": "25de9e6c-4c02-4658-b076-466681d96605", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1644366708-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a79e09b1c4ae4cc5ab11c3e56ee4f0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape7548d21-c8", "ovs_interfaceid": "e7548d21-c865-4648-a8a6-dba748a01e14", "qbh_params": 
null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Converting VIF {"id": "e7548d21-c865-4648-a8a6-dba748a01e14", "address": "fa:16:3e:08:69:16", "network": {"id": "25de9e6c-4c02-4658-b076-466681d96605", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1644366708-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a79e09b1c4ae4cc5ab11c3e56ee4f0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape7548d21-c8", "ovs_interfaceid": "e7548d21-c865-4648-a8a6-dba748a01e14", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:08:69:16,bridge_name='br-int',has_traffic_filtering=True,id=e7548d21-c865-4648-a8a6-dba748a01e14,network=Network(25de9e6c-4c02-4658-b076-466681d96605),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7548d21-c8') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG os_vif [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:08:69:16,bridge_name='br-int',has_traffic_filtering=True,id=e7548d21-c865-4648-a8a6-dba748a01e14,network=Network(25de9e6c-4c02-4658-b076-466681d96605),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7548d21-c8') {{(pid=71428) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71428) do_commit 
/usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape7548d21-c8, may_exist=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape7548d21-c8, col_values=(('external_ids', {'iface-id': 'e7548d21-c865-4648-a8a6-dba748a01e14', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:08:69:16', 'vm-uuid': 'c8480a7c-b5de-4f66-a4f0-08fc679b0dfd'}),)) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:11 user nova-compute[71428]: INFO os_vif [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:08:69:16,bridge_name='br-int',has_traffic_filtering=True,id=e7548d21-c865-4648-a8a6-dba748a01e14,network=Network(25de9e6c-4c02-4658-b076-466681d96605),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7548d21-c8') Apr 23 03:49:11 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6f7cf091-3140-4746-bd1c-95c42ea82f6e/disk --force-share --output=json" returned: 0 in 0.150s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=71428) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] No VIF found with MAC fa:16:3e:08:69:16, not building metadata {{(pid=71428) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/954f41b2-5c67-41a4-b38b-ebf3ad60cac7/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/954f41b2-5c67-41a4-b38b-ebf3ad60cac7/disk --force-share --output=json" returned: 0 in 0.144s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/954f41b2-5c67-41a4-b38b-ebf3ad60cac7/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.579s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG 
nova.compute.manager [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Start building networks asynchronously for instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG nova.compute.manager [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Allocating IP information in the background. {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG nova.network.neutron [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] allocate_for_instance() {{(pid=71428) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 23 03:49:11 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 23 03:49:11 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/954f41b2-5c67-41a4-b38b-ebf3ad60cac7/disk --force-share --output=json" returned: 0 in 0.148s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG nova.compute.manager [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Start building block device mappings for instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f279f3d3-581d-4d6f-924f-4104ec23832a/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG nova.compute.manager [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Start spawning the instance on the hypervisor. 
{{(pid=71428) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Creating instance directory {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 23 03:49:11 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Creating image(s) Apr 23 03:49:11 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Acquiring lock "/opt/stack/data/nova/instances/6ee9a44e-e6af-4eec-975c-3991146ce71b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Lock "/opt/stack/data/nova/instances/6ee9a44e-e6af-4eec-975c-3991146ce71b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Lock "/opt/stack/data/nova/instances/6ee9a44e-e6af-4eec-975c-3991146ce71b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Updating instance_info_cache with network_info: [] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f279f3d3-581d-4d6f-924f-4104ec23832a/disk --force-share --output=json" returned: 0 in 0.199s {{(pid=71428) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:49:11 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f279f3d3-581d-4d6f-924f-4104ec23832a/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:49:11 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Took 0.90 seconds to deallocate network for instance. Apr 23 03:49:12 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.134s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:49:12 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Acquiring lock "935dc3474dddbde110531a31510941caddc0ae83" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:49:12 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:49:12 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:49:12 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-aeac8a8c-69e5-4ef2-b812-1ef35e6691fe tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:49:12 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-aeac8a8c-69e5-4ef2-b812-1ef35e6691fe tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.002s {{(pid=71428) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:49:12 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f279f3d3-581d-4d6f-924f-4104ec23832a/disk --force-share --output=json" returned: 0 in 0.163s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:49:12 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91756f90-733e-4aa5-9108-d2d8b1d020fe/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:49:12 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.138s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:49:12 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/6ee9a44e-e6af-4eec-975c-3991146ce71b/disk 1073741824 {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:49:12 user nova-compute[71428]: DEBUG nova.policy [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e3d689f1c160478ca83bbff3104d8ec3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '70b031ddc5c94ca98e7161de03bda4b7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71428) authorize /opt/stack/nova/nova/policy.py:203}} Apr 23 03:49:12 user nova-compute[71428]: DEBUG nova.network.neutron [req-81a5e1e3-e308-45fe-94d9-c503ffecc66a req-e0781568-8388-4342-aab8-73a56bfa669f service nova] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Updated VIF entry in instance network info cache for port e7548d21-c865-4648-a8a6-dba748a01e14. 
{{(pid=71428) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 23 03:49:12 user nova-compute[71428]: DEBUG nova.network.neutron [req-81a5e1e3-e308-45fe-94d9-c503ffecc66a req-e0781568-8388-4342-aab8-73a56bfa669f service nova] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Updating instance_info_cache with network_info: [{"id": "e7548d21-c865-4648-a8a6-dba748a01e14", "address": "fa:16:3e:08:69:16", "network": {"id": "25de9e6c-4c02-4658-b076-466681d96605", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1644366708-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a79e09b1c4ae4cc5ab11c3e56ee4f0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape7548d21-c8", "ovs_interfaceid": "e7548d21-c865-4648-a8a6-dba748a01e14", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:49:12 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/6ee9a44e-e6af-4eec-975c-3991146ce71b/disk 1073741824" returned: 0 in 0.068s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:49:12 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.210s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:49:12 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:49:12 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-81a5e1e3-e308-45fe-94d9-c503ffecc66a req-e0781568-8388-4342-aab8-73a56bfa669f service nova] Releasing lock "refresh_cache-c8480a7c-b5de-4f66-a4f0-08fc679b0dfd" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:49:12 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD 
"/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91756f90-733e-4aa5-9108-d2d8b1d020fe/disk --force-share --output=json" returned: 0 in 0.178s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:49:12 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91756f90-733e-4aa5-9108-d2d8b1d020fe/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:49:12 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.140s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:49:12 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Checking if we can resize image /opt/stack/data/nova/instances/6ee9a44e-e6af-4eec-975c-3991146ce71b/disk. size=1073741824 {{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 23 03:49:12 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6ee9a44e-e6af-4eec-975c-3991146ce71b/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:49:12 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91756f90-733e-4aa5-9108-d2d8b1d020fe/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:49:14 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Resumed> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:49:15 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] VM Resumed (Lifecycle Event) Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.network.neutron [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc 
tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Successfully created port: aaf8a702-e47a-4fc7-a9a8-3a7e63525294 {{(pid=71428) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.compute.manager [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Instance event wait completed in 0 seconds for {{(pid=71428) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Guest created on hypervisor {{(pid=71428) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.compute.manager [req-73be63ee-aba6-46cb-8df3-7e567912a91e req-1604a95c-da6a-4f40-aa49-974b22c85e19 service nova] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Received event network-vif-plugged-af57258e-69b3-405b-a9f6-8d54c1810960 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-73be63ee-aba6-46cb-8df3-7e567912a91e req-1604a95c-da6a-4f40-aa49-974b22c85e19 service nova] Acquiring lock "3ec36a95-88ce-4ed1-9726-7e6d98674dec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-73be63ee-aba6-46cb-8df3-7e567912a91e req-1604a95c-da6a-4f40-aa49-974b22c85e19 service nova] Lock "3ec36a95-88ce-4ed1-9726-7e6d98674dec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-73be63ee-aba6-46cb-8df3-7e567912a91e req-1604a95c-da6a-4f40-aa49-974b22c85e19 service nova] Lock "3ec36a95-88ce-4ed1-9726-7e6d98674dec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.compute.manager [req-73be63ee-aba6-46cb-8df3-7e567912a91e req-1604a95c-da6a-4f40-aa49-974b22c85e19 service nova] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] No waiting events found dispatching network-vif-plugged-af57258e-69b3-405b-a9f6-8d54c1810960 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:49:15 user nova-compute[71428]: WARNING nova.compute.manager [req-73be63ee-aba6-46cb-8df3-7e567912a91e req-1604a95c-da6a-4f40-aa49-974b22c85e19 service nova] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Received unexpected event network-vif-plugged-af57258e-69b3-405b-a9f6-8d54c1810960 for instance with vm_state deleted and task_state None. 
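The qemu-img activity interleaved through the entries above (03:49:11-03:49:12) boils down to two operations: creating a qcow2 overlay for instance 6ee9a44e-e6af-4eec-975c-3991146ce71b on top of the cached base image, and probing disks with `qemu-img info --force-share --output=json` under a prlimit wrapper. A runnable sketch of the same two commands using only the standard library; the paths and size are copied from the log, and this stands in for Nova's imagebackend rather than reproducing it:

```python
# Sketch of the two qemu-img operations seen in the surrounding entries;
# paths and the 1073741824-byte size are copied from the log. This mirrors the
# logged commands with the standard library and is not Nova's imagebackend code.
import json
import os
import subprocess

env = {**os.environ, "LC_ALL": "C", "LANG": "C"}
base = "/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83"
disk = "/opt/stack/data/nova/instances/6ee9a44e-e6af-4eec-975c-3991146ce71b/disk"

# 1) qcow2 overlay whose backing file is the cached (raw) base image, 1 GiB virtual size
subprocess.run(
    ["qemu-img", "create", "-f", "qcow2",
     "-o", f"backing_file={base},backing_fmt=raw",
     disk, "1073741824"],
    check=True, env=env,
)

# 2) probe the result the way the periodic disk-usage task does; --force-share
#    allows inspecting an image that may be opened by a running guest
info = json.loads(subprocess.check_output(
    ["qemu-img", "info", disk, "--force-share", "--output=json"], env=env,
))
print(info["format"], info["virtual-size"])
```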
Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.compute.manager [req-73be63ee-aba6-46cb-8df3-7e567912a91e req-1604a95c-da6a-4f40-aa49-974b22c85e19 service nova] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Received event network-vif-deleted-af57258e-69b3-405b-a9f6-8d54c1810960 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.compute.manager [req-6179646f-122a-4fa6-8c11-51f6c9b3f606 req-de762340-a5f0-4c0f-b2dd-08b10db113d0 service nova] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Received event network-vif-plugged-e7548d21-c865-4648-a8a6-dba748a01e14 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-6179646f-122a-4fa6-8c11-51f6c9b3f606 req-de762340-a5f0-4c0f-b2dd-08b10db113d0 service nova] Acquiring lock "c8480a7c-b5de-4f66-a4f0-08fc679b0dfd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-6179646f-122a-4fa6-8c11-51f6c9b3f606 req-de762340-a5f0-4c0f-b2dd-08b10db113d0 service nova] Lock "c8480a7c-b5de-4f66-a4f0-08fc679b0dfd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-6179646f-122a-4fa6-8c11-51f6c9b3f606 req-de762340-a5f0-4c0f-b2dd-08b10db113d0 service nova] Lock "c8480a7c-b5de-4f66-a4f0-08fc679b0dfd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.compute.manager [req-6179646f-122a-4fa6-8c11-51f6c9b3f606 req-de762340-a5f0-4c0f-b2dd-08b10db113d0 service nova] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] No waiting events found dispatching network-vif-plugged-e7548d21-c865-4648-a8a6-dba748a01e14 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:49:15 user nova-compute[71428]: WARNING nova.compute.manager [req-6179646f-122a-4fa6-8c11-51f6c9b3f606 req-de762340-a5f0-4c0f-b2dd-08b10db113d0 service nova] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Received unexpected event network-vif-plugged-e7548d21-c865-4648-a8a6-dba748a01e14 for instance with vm_state building and task_state spawning. 
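The "No waiting events found dispatching network-vif-plugged-..." and "Received unexpected event ..." lines above come from the external-event handshake: the spawning thread registers a per-instance waiter before plugging the VIF, and the Neutron-driven handler pops it when the event arrives; when nothing is registered (or the instance is already gone, as for 3ec36a95-88ce-4ed1-9726-7e6d98674dec), the event is reported as unexpected and dropped. A simplified sketch of that waiter pattern, using plain threading primitives rather than Nova's InstanceEvents machinery:

```python
# Simplified illustration of the instance-event handshake logged above,
# not Nova's implementation.
import threading

_waiters = {}                 # (instance_uuid, event_name) -> threading.Event
_waiters_lock = threading.Lock()

def prepare_for_event(instance_uuid, event_name):
    """Spawning side: register interest before plugging the VIF."""
    ev = threading.Event()
    with _waiters_lock:
        _waiters[(instance_uuid, event_name)] = ev
    return ev

def dispatch_external_event(instance_uuid, event_name):
    """Neutron-driven side: wake the waiter, or report the event as unexpected."""
    with _waiters_lock:
        ev = _waiters.pop((instance_uuid, event_name), None)
    if ev is None:
        # analogous to "No waiting events found dispatching ..." /
        # "Received unexpected event ..." in the log
        print(f"unexpected event {event_name} for {instance_uuid}")
        return
    ev.set()

uuid = "c8480a7c-b5de-4f66-a4f0-08fc679b0dfd"
event = "network-vif-plugged-e7548d21-c865-4648-a8a6-dba748a01e14"
waiter = prepare_for_event(uuid, event)
dispatch_external_event(uuid, event)      # delivered: waiter is woken
assert waiter.wait(timeout=1)
dispatch_external_event("3ec36a95-88ce-4ed1-9726-7e6d98674dec",
                        "network-vif-plugged-af57258e-69b3-405b-a9f6-8d54c1810960")
```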
Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.compute.manager [req-00c4a25c-660a-4eea-8a7e-b727a37a999c req-5d8f24d6-ab8e-4f71-954b-38c093b89979 service nova] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Received event network-vif-plugged-e7548d21-c865-4648-a8a6-dba748a01e14 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-00c4a25c-660a-4eea-8a7e-b727a37a999c req-5d8f24d6-ab8e-4f71-954b-38c093b89979 service nova] Acquiring lock "c8480a7c-b5de-4f66-a4f0-08fc679b0dfd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-00c4a25c-660a-4eea-8a7e-b727a37a999c req-5d8f24d6-ab8e-4f71-954b-38c093b89979 service nova] Lock "c8480a7c-b5de-4f66-a4f0-08fc679b0dfd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-00c4a25c-660a-4eea-8a7e-b727a37a999c req-5d8f24d6-ab8e-4f71-954b-38c093b89979 service nova] Lock "c8480a7c-b5de-4f66-a4f0-08fc679b0dfd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.compute.manager [req-00c4a25c-660a-4eea-8a7e-b727a37a999c req-5d8f24d6-ab8e-4f71-954b-38c093b89979 service nova] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] No waiting events found dispatching network-vif-plugged-e7548d21-c865-4648-a8a6-dba748a01e14 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:49:15 user nova-compute[71428]: WARNING nova.compute.manager [req-00c4a25c-660a-4eea-8a7e-b727a37a999c req-5d8f24d6-ab8e-4f71-954b-38c093b89979 service nova] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Received unexpected event network-vif-plugged-e7548d21-c865-4648-a8a6-dba748a01e14 for instance with vm_state building and task_state spawning. Apr 23 03:49:15 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6ee9a44e-e6af-4eec-975c-3991146ce71b/disk --force-share --output=json" returned: 0 in 2.655s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Cannot resize image /opt/stack/data/nova/instances/6ee9a44e-e6af-4eec-975c-3991146ce71b/disk to a smaller size. 
{{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.objects.instance [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Lazy-loading 'migration_context' on Instance uuid 6ee9a44e-e6af-4eec-975c-3991146ce71b {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-aeac8a8c-69e5-4ef2-b812-1ef35e6691fe tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Created local disks {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Ensure instance console log exists: /opt/stack/data/nova/instances/6ee9a44e-e6af-4eec-975c-3991146ce71b/console.log {{(pid=71428) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-aeac8a8c-69e5-4ef2-b812-1ef35e6691fe tempest-AttachVolumeNegativeTest-636753786 
tempest-AttachVolumeNegativeTest-636753786-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:49:15 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Instance spawned successfully. Apr 23 03:49:15 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:49:15 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Hypervisor/Node resource view: name=user free_ram=8575MB free_disk=26.20771026611328GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", 
"dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", 
"product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71428) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 23 03:49:15 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Started> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:49:15 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] VM Started (Lifecycle Event) Apr 23 03:49:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-aeac8a8c-69e5-4ef2-b812-1ef35e6691fe tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 3.022s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.026s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Found default for hw_cdrom_bus of ide {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Found default for hw_disk_bus of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Found default for hw_input_bus of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Found default for hw_pointer_model of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Found default for hw_video_model 
of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Found default for hw_vif_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 03:49:15 user nova-compute[71428]: INFO nova.scheduler.client.report [None req-aeac8a8c-69e5-4ef2-b812-1ef35e6691fe tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Deleted allocations for instance 3ec36a95-88ce-4ed1-9726-7e6d98674dec Apr 23 03:49:15 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] During sync_power_state the instance has a pending task (spawning). Skip. Apr 23 03:49:15 user nova-compute[71428]: INFO nova.compute.manager [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Took 6.43 seconds to spawn the instance on the hypervisor. Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.compute.manager [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-aeac8a8c-69e5-4ef2-b812-1ef35e6691fe tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "3ec36a95-88ce-4ed1-9726-7e6d98674dec" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 4.873s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance f279f3d3-581d-4d6f-924f-4104ec23832a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 954f41b2-5c67-41a4-b38b-ebf3ad60cac7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 91756f90-733e-4aa5-9108-d2d8b1d020fe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 6f7cf091-3140-4746-bd1c-95c42ea82f6e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance c8480a7c-b5de-4f66-a4f0-08fc679b0dfd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 6ee9a44e-e6af-4eec-975c-3991146ce71b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Total usable vcpus: 12, total allocated vcpus: 6 {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Final resource view: name=user phys_ram=16023MB used_ram=1280MB phys_disk=40GB used_disk=6GB total_vcpus=12 used_vcpus=6 pci_stats=[] {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 23 03:49:15 user nova-compute[71428]: INFO nova.compute.manager [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Took 7.19 seconds to build instance. 
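The "Final resource view" figures above can be cross-checked against the six instances listed just before it (each holding 1 VCPU / 128 MB / 1 GB in placement) and the 512 MB reserved memory visible in the MEMORY_MB inventory entries. A small worked check; the last line assumes the usual placement capacity formula (total - reserved) * allocation_ratio, which is not itself printed in this log:

    # Worked check of the logged resource-tracker totals (illustration only).
    instances = 6                                # the six instances listed above
    per_instance = {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}
    reserved_host_memory_mb = 512                # 'reserved' for MEMORY_MB in the inventory

    used_vcpus = instances * per_instance['VCPU']                                   # 6    -> used_vcpus=6
    used_ram_mb = reserved_host_memory_mb + instances * per_instance['MEMORY_MB']   # 1280 -> used_ram=1280MB
    used_disk_gb = instances * per_instance['DISK_GB']                              # 6    -> used_disk=6GB

    # Assumed placement capacity formula: (total - reserved) * allocation_ratio
    schedulable_vcpus = (12 - 0) * 4.0                                              # 48 with the logged ratio
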
Apr 23 03:49:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-2a116d4e-8cf4-4761-aa53-ebe6cc43c9ad tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "c8480a7c-b5de-4f66-a4f0-08fc679b0dfd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.283s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Compute_service record updated for user:user {{(pid=71428) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.379s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.network.neutron [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Successfully updated port: aaf8a702-e47a-4fc7-a9a8-3a7e63525294 {{(pid=71428) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Acquiring lock "refresh_cache-6ee9a44e-e6af-4eec-975c-3991146ce71b" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Acquired lock "refresh_cache-6ee9a44e-e6af-4eec-975c-3991146ce71b" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.network.neutron [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] 
Building network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.network.neutron [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Instance cache missing network info. {{(pid=71428) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.compute.manager [req-c00d7f3c-ff71-4e5d-9399-67fc69730a7a req-cf9e0e51-233f-4a04-baf5-370e7a5c7a02 service nova] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Received event network-changed-aaf8a702-e47a-4fc7-a9a8-3a7e63525294 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG nova.compute.manager [req-c00d7f3c-ff71-4e5d-9399-67fc69730a7a req-cf9e0e51-233f-4a04-baf5-370e7a5c7a02 service nova] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Refreshing instance network info cache due to event network-changed-aaf8a702-e47a-4fc7-a9a8-3a7e63525294. {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 23 03:49:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-c00d7f3c-ff71-4e5d-9399-67fc69730a7a req-cf9e0e51-233f-4a04-baf5-370e7a5c7a02 service nova] Acquiring lock "refresh_cache-6ee9a44e-e6af-4eec-975c-3991146ce71b" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG nova.network.neutron [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Updating instance_info_cache with network_info: [{"id": "aaf8a702-e47a-4fc7-a9a8-3a7e63525294", "address": "fa:16:3e:a3:c1:eb", "network": {"id": "5604c946-dd68-4517-9c39-b4d33d86a345", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1424127131-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70b031ddc5c94ca98e7161de03bda4b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapaaf8a702-e4", "ovs_interfaceid": "aaf8a702-e47a-4fc7-a9a8-3a7e63525294", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Releasing lock "refresh_cache-6ee9a44e-e6af-4eec-975c-3991146ce71b" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG nova.compute.manager [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 
tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Instance network_info: |[{"id": "aaf8a702-e47a-4fc7-a9a8-3a7e63525294", "address": "fa:16:3e:a3:c1:eb", "network": {"id": "5604c946-dd68-4517-9c39-b4d33d86a345", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1424127131-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70b031ddc5c94ca98e7161de03bda4b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapaaf8a702-e4", "ovs_interfaceid": "aaf8a702-e47a-4fc7-a9a8-3a7e63525294", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-c00d7f3c-ff71-4e5d-9399-67fc69730a7a req-cf9e0e51-233f-4a04-baf5-370e7a5c7a02 service nova] Acquired lock "refresh_cache-6ee9a44e-e6af-4eec-975c-3991146ce71b" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG nova.network.neutron [req-c00d7f3c-ff71-4e5d-9399-67fc69730a7a req-cf9e0e51-233f-4a04-baf5-370e7a5c7a02 service nova] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Refreshing network info cache for port aaf8a702-e47a-4fc7-a9a8-3a7e63525294 {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Start _get_guest_xml network_info=[{"id": "aaf8a702-e47a-4fc7-a9a8-3a7e63525294", "address": "fa:16:3e:a3:c1:eb", "network": {"id": "5604c946-dd68-4517-9c39-b4d33d86a345", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1424127131-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70b031ddc5c94ca98e7161de03bda4b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapaaf8a702-e4", "ovs_interfaceid": "aaf8a702-e47a-4fc7-a9a8-3a7e63525294", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'image_id': 'e6127373-9931-4277-9458-eceef653ea1e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 23 03:49:16 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:49:16 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:49:16 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71428) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-23T03:43:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=), allow threads: True {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Flavor limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Image 
limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Flavor pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Image pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Got 1 possible topologies {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:49:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-95239721',display_name='tempest-AttachVolumeTestJSON-server-95239721',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-95239721',id=10,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKEbelubDhWYxOcMjX0yw/rKDeobxDpM94ii4rSjeoKkqXPZBE/gk0LSlgoNggx8xKW/qBeJcmHECpzHOVf8zkPxD/oth6aCuNZeiupH6KLcLo/umHtjWJDWCvB+mcrBEQ==',key_name='tempest-keypair-1779422949',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='70b031ddc5c94ca98e7161de03bda4b7',ramdisk_id='',reservation_id='r-sdoft0x0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-276721084',owner_user_name='tempest-AttachVolumeTestJSON-276721084-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:49:12Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e3d689f1c160478ca83bbff3104d8ec3',uuid=6ee9a44e-e6af-4eec-975c-3991146ce71b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "aaf8a702-e47a-4fc7-a9a8-3a7e63525294", "address": "fa:16:3e:a3:c1:eb", "network": {"id": "5604c946-dd68-4517-9c39-b4d33d86a345", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1424127131-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70b031ddc5c94ca98e7161de03bda4b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapaaf8a702-e4", "ovs_interfaceid": "aaf8a702-e47a-4fc7-a9a8-3a7e63525294", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71428) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 23 03:49:16 
user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Converting VIF {"id": "aaf8a702-e47a-4fc7-a9a8-3a7e63525294", "address": "fa:16:3e:a3:c1:eb", "network": {"id": "5604c946-dd68-4517-9c39-b4d33d86a345", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1424127131-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70b031ddc5c94ca98e7161de03bda4b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapaaf8a702-e4", "ovs_interfaceid": "aaf8a702-e47a-4fc7-a9a8-3a7e63525294", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a3:c1:eb,bridge_name='br-int',has_traffic_filtering=True,id=aaf8a702-e47a-4fc7-a9a8-3a7e63525294,network=Network(5604c946-dd68-4517-9c39-b4d33d86a345),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaaf8a702-e4') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG nova.objects.instance [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Lazy-loading 'pci_devices' on Instance uuid 6ee9a44e-e6af-4eec-975c-3991146ce71b {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] End _get_guest_xml xml= [libvirt guest XML omitted: the capture stripped the XML markup from the multi-line dump, leaving only timestamp prefixes and text nodes; surviving text fragments, in order: 6ee9a44e-e6af-4eec-975c-3991146ce71b, instance-0000000a, 131072, 1, tempest-AttachVolumeTestJSON-server-95239721, 2023-04-23 03:49:16, 128, 1, 0, 0, 1, tempest-AttachVolumeTestJSON-276721084-project-member, tempest-AttachVolumeTestJSON-276721084, OpenStack Foundation, OpenStack Nova, 0.0.0, 6ee9a44e-e6af-4eec-975c-3991146ce71b, 6ee9a44e-e6af-4eec-975c-3991146ce71b, Virtual Machine, hvm, Nehalem, /dev/urandom] {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] vif_type=ovs
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:49:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-95239721',display_name='tempest-AttachVolumeTestJSON-server-95239721',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-95239721',id=10,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKEbelubDhWYxOcMjX0yw/rKDeobxDpM94ii4rSjeoKkqXPZBE/gk0LSlgoNggx8xKW/qBeJcmHECpzHOVf8zkPxD/oth6aCuNZeiupH6KLcLo/umHtjWJDWCvB+mcrBEQ==',key_name='tempest-keypair-1779422949',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='70b031ddc5c94ca98e7161de03bda4b7',ramdisk_id='',reservation_id='r-sdoft0x0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-276721084',owner_user_name='tempest-AttachVolumeTestJSON-276721084-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:49:12Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e3d689f1c160478ca83bbff3104d8ec3',uuid=6ee9a44e-e6af-4eec-975c-3991146ce71b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "aaf8a702-e47a-4fc7-a9a8-3a7e63525294", "address": "fa:16:3e:a3:c1:eb", "network": {"id": "5604c946-dd68-4517-9c39-b4d33d86a345", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1424127131-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70b031ddc5c94ca98e7161de03bda4b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapaaf8a702-e4", "ovs_interfaceid": "aaf8a702-e47a-4fc7-a9a8-3a7e63525294", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 23 03:49:16 user 
nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Converting VIF {"id": "aaf8a702-e47a-4fc7-a9a8-3a7e63525294", "address": "fa:16:3e:a3:c1:eb", "network": {"id": "5604c946-dd68-4517-9c39-b4d33d86a345", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1424127131-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70b031ddc5c94ca98e7161de03bda4b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapaaf8a702-e4", "ovs_interfaceid": "aaf8a702-e47a-4fc7-a9a8-3a7e63525294", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a3:c1:eb,bridge_name='br-int',has_traffic_filtering=True,id=aaf8a702-e47a-4fc7-a9a8-3a7e63525294,network=Network(5604c946-dd68-4517-9c39-b4d33d86a345),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaaf8a702-e4') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG os_vif [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a3:c1:eb,bridge_name='br-int',has_traffic_filtering=True,id=aaf8a702-e47a-4fc7-a9a8-3a7e63525294,network=Network(5604c946-dd68-4517-9c39-b4d33d86a345),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaaf8a702-e4') {{(pid=71428) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): 
AddPortCommand(_result=None, bridge=br-int, port=tapaaf8a702-e4, may_exist=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapaaf8a702-e4, col_values=(('external_ids', {'iface-id': 'aaf8a702-e47a-4fc7-a9a8-3a7e63525294', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a3:c1:eb', 'vm-uuid': '6ee9a44e-e6af-4eec-975c-3991146ce71b'}),)) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:16 user nova-compute[71428]: INFO os_vif [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a3:c1:eb,bridge_name='br-int',has_traffic_filtering=True,id=aaf8a702-e47a-4fc7-a9a8-3a7e63525294,network=Network(5604c946-dd68-4517-9c39-b4d33d86a345),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaaf8a702-e4') Apr 23 03:49:16 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] No BDM found with device name vda, not building metadata. {{(pid=71428) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] No VIF found with MAC fa:16:3e:a3:c1:eb, not building metadata {{(pid=71428) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Starting heal instance info cache {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Didn't find any instances for network info cache update. 
{{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG nova.compute.manager [req-e05a44f1-4539-4faf-9ab2-a9a4909a88cf req-119f6ee8-a686-4993-9712-7cda50f1318f service nova] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Received event network-changed-b7f45c1f-8b9b-4d06-a389-d7643565e934 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG nova.compute.manager [req-e05a44f1-4539-4faf-9ab2-a9a4909a88cf req-119f6ee8-a686-4993-9712-7cda50f1318f service nova] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Refreshing instance network info cache due to event network-changed-b7f45c1f-8b9b-4d06-a389-d7643565e934. {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-e05a44f1-4539-4faf-9ab2-a9a4909a88cf req-119f6ee8-a686-4993-9712-7cda50f1318f service nova] Acquiring lock "refresh_cache-91756f90-733e-4aa5-9108-d2d8b1d020fe" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-e05a44f1-4539-4faf-9ab2-a9a4909a88cf req-119f6ee8-a686-4993-9712-7cda50f1318f service nova] Acquired lock "refresh_cache-91756f90-733e-4aa5-9108-d2d8b1d020fe" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG nova.network.neutron [req-e05a44f1-4539-4faf-9ab2-a9a4909a88cf req-119f6ee8-a686-4993-9712-7cda50f1318f service nova] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Refreshing network info cache for port b7f45c1f-8b9b-4d06-a389-d7643565e934 {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG nova.network.neutron [req-c00d7f3c-ff71-4e5d-9399-67fc69730a7a req-cf9e0e51-233f-4a04-baf5-370e7a5c7a02 service nova] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Updated VIF entry in instance network info cache for port aaf8a702-e47a-4fc7-a9a8-3a7e63525294. 
{{(pid=71428) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG nova.network.neutron [req-c00d7f3c-ff71-4e5d-9399-67fc69730a7a req-cf9e0e51-233f-4a04-baf5-370e7a5c7a02 service nova] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Updating instance_info_cache with network_info: [{"id": "aaf8a702-e47a-4fc7-a9a8-3a7e63525294", "address": "fa:16:3e:a3:c1:eb", "network": {"id": "5604c946-dd68-4517-9c39-b4d33d86a345", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1424127131-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70b031ddc5c94ca98e7161de03bda4b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapaaf8a702-e4", "ovs_interfaceid": "aaf8a702-e47a-4fc7-a9a8-3a7e63525294", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:49:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-c00d7f3c-ff71-4e5d-9399-67fc69730a7a req-cf9e0e51-233f-4a04-baf5-370e7a5c7a02 service nova] Releasing lock "refresh_cache-6ee9a44e-e6af-4eec-975c-3991146ce71b" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:49:17 user nova-compute[71428]: DEBUG nova.network.neutron [req-e05a44f1-4539-4faf-9ab2-a9a4909a88cf req-119f6ee8-a686-4993-9712-7cda50f1318f service nova] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Updated VIF entry in instance network info cache for port b7f45c1f-8b9b-4d06-a389-d7643565e934. 
{{(pid=71428) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 23 03:49:17 user nova-compute[71428]: DEBUG nova.network.neutron [req-e05a44f1-4539-4faf-9ab2-a9a4909a88cf req-119f6ee8-a686-4993-9712-7cda50f1318f service nova] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Updating instance_info_cache with network_info: [{"id": "b7f45c1f-8b9b-4d06-a389-d7643565e934", "address": "fa:16:3e:29:a9:72", "network": {"id": "dca689d5-53e5-466b-bd5c-4c29b330d27e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2008984306-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.168", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5eb0a03655cf4aa78e27c81ea4e1c424", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f45c1f-8b", "ovs_interfaceid": "b7f45c1f-8b9b-4d06-a389-d7643565e934", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:49:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-e05a44f1-4539-4faf-9ab2-a9a4909a88cf req-119f6ee8-a686-4993-9712-7cda50f1318f service nova] Releasing lock "refresh_cache-91756f90-733e-4aa5-9108-d2d8b1d020fe" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:49:17 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:17 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:17 user nova-compute[71428]: DEBUG nova.compute.manager [req-07b54934-0dd6-4ee3-b058-c8e3abf17809 req-a6c6f08e-d163-4ccb-b9c5-22ef6e907bcb service nova] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Received event network-vif-plugged-aaf8a702-e47a-4fc7-a9a8-3a7e63525294 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:49:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-07b54934-0dd6-4ee3-b058-c8e3abf17809 req-a6c6f08e-d163-4ccb-b9c5-22ef6e907bcb service nova] Acquiring lock "6ee9a44e-e6af-4eec-975c-3991146ce71b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:49:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-07b54934-0dd6-4ee3-b058-c8e3abf17809 req-a6c6f08e-d163-4ccb-b9c5-22ef6e907bcb service nova] Lock "6ee9a44e-e6af-4eec-975c-3991146ce71b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:49:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-07b54934-0dd6-4ee3-b058-c8e3abf17809 
req-a6c6f08e-d163-4ccb-b9c5-22ef6e907bcb service nova] Lock "6ee9a44e-e6af-4eec-975c-3991146ce71b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:49:17 user nova-compute[71428]: DEBUG nova.compute.manager [req-07b54934-0dd6-4ee3-b058-c8e3abf17809 req-a6c6f08e-d163-4ccb-b9c5-22ef6e907bcb service nova] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] No waiting events found dispatching network-vif-plugged-aaf8a702-e47a-4fc7-a9a8-3a7e63525294 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:49:17 user nova-compute[71428]: WARNING nova.compute.manager [req-07b54934-0dd6-4ee3-b058-c8e3abf17809 req-a6c6f08e-d163-4ccb-b9c5-22ef6e907bcb service nova] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Received unexpected event network-vif-plugged-aaf8a702-e47a-4fc7-a9a8-3a7e63525294 for instance with vm_state building and task_state spawning. Apr 23 03:49:18 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:18 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:18 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:18 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Acquiring lock "1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:49:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Lock "1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:49:19 user nova-compute[71428]: DEBUG nova.compute.manager [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Starting instance... 
{{(pid=71428) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 23 03:49:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:49:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:49:19 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71428) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 23 03:49:19 user nova-compute[71428]: INFO nova.compute.claims [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Claim successful on node user Apr 23 03:49:19 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:49:19 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:49:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.424s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:49:19 user nova-compute[71428]: DEBUG nova.compute.manager [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Start building networks asynchronously for instance. 
{{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 23 03:49:19 user nova-compute[71428]: DEBUG nova.compute.manager [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Allocating IP information in the background. {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 23 03:49:19 user nova-compute[71428]: DEBUG nova.network.neutron [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] allocate_for_instance() {{(pid=71428) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 23 03:49:19 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 23 03:49:19 user nova-compute[71428]: DEBUG nova.compute.manager [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Start building block device mappings for instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 23 03:49:19 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Resumed> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:49:19 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] VM Resumed (Lifecycle Event) Apr 23 03:49:19 user nova-compute[71428]: DEBUG nova.compute.manager [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Instance event wait completed in 0 seconds for {{(pid=71428) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 23 03:49:19 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Guest created on hypervisor {{(pid=71428) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 23 03:49:19 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Instance spawned successfully. 
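The os_vif/ovsdbapp lines above (AddBridgeCommand, AddPortCommand, DbSetCommand on the Interface row, then "Successfully plugged vif") are the OVS half of the VIF plug. A minimal sketch of the same commands issued through ovsdbapp directly, with the bridge, tap device, MAC and UUIDs copied from the log; the OVSDB endpoint and timeout are assumptions, and this is an illustration rather than Nova's or os-vif's own plug code:

    # Sketch: reproduce the OVSDB commands seen above (ensure br-int exists,
    # add the tap port, set external_ids on its Interface row).
    from ovsdbapp.backend.ovs_idl import connection
    from ovsdbapp.schema.open_vswitch import impl_idl

    OVSDB = 'tcp:127.0.0.1:6640'  # assumed OVSDB manager endpoint

    idl = connection.OvsdbIdl.from_server(OVSDB, 'Open_vSwitch')
    api = impl_idl.OvsdbIdl(connection.Connection(idl, timeout=10))

    external_ids = {
        'iface-id': 'aaf8a702-e47a-4fc7-a9a8-3a7e63525294',  # Neutron port UUID
        'iface-status': 'active',
        'attached-mac': 'fa:16:3e:a3:c1:eb',
        'vm-uuid': '6ee9a44e-e6af-4eec-975c-3991146ce71b',   # instance UUID
    }

    with api.transaction(check_error=True) as txn:
        txn.add(api.add_br('br-int', may_exist=True, datapath_type='system'))
        txn.add(api.add_port('br-int', 'tapaaf8a702-e4', may_exist=True))
        txn.add(api.db_set('Interface', 'tapaaf8a702-e4',
                           ('external_ids', external_ids)))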
Apr 23 03:49:19 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 23 03:49:19 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:49:19 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 03:49:19 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Found default for hw_cdrom_bus of ide {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:49:19 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Found default for hw_disk_bus of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:49:19 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Found default for hw_input_bus of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:49:19 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Found default for hw_pointer_model of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:49:19 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Found default for hw_video_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:49:19 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Found default for hw_vif_model of 
virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:49:19 user nova-compute[71428]: DEBUG nova.compute.manager [req-b1a48472-64e2-4797-a21c-7d2caca25d19 req-39be0c9c-3e5e-414b-9c35-af8526dab406 service nova] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Received event network-vif-plugged-aaf8a702-e47a-4fc7-a9a8-3a7e63525294 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:49:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-b1a48472-64e2-4797-a21c-7d2caca25d19 req-39be0c9c-3e5e-414b-9c35-af8526dab406 service nova] Acquiring lock "6ee9a44e-e6af-4eec-975c-3991146ce71b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:49:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-b1a48472-64e2-4797-a21c-7d2caca25d19 req-39be0c9c-3e5e-414b-9c35-af8526dab406 service nova] Lock "6ee9a44e-e6af-4eec-975c-3991146ce71b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:49:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-b1a48472-64e2-4797-a21c-7d2caca25d19 req-39be0c9c-3e5e-414b-9c35-af8526dab406 service nova] Lock "6ee9a44e-e6af-4eec-975c-3991146ce71b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:49:19 user nova-compute[71428]: DEBUG nova.compute.manager [req-b1a48472-64e2-4797-a21c-7d2caca25d19 req-39be0c9c-3e5e-414b-9c35-af8526dab406 service nova] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] No waiting events found dispatching network-vif-plugged-aaf8a702-e47a-4fc7-a9a8-3a7e63525294 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:49:19 user nova-compute[71428]: WARNING nova.compute.manager [req-b1a48472-64e2-4797-a21c-7d2caca25d19 req-39be0c9c-3e5e-414b-9c35-af8526dab406 service nova] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Received unexpected event network-vif-plugged-aaf8a702-e47a-4fc7-a9a8-3a7e63525294 for instance with vm_state building and task_state spawning. Apr 23 03:49:19 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] During sync_power_state the instance has a pending task (spawning). Skip. 
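The repeated Acquiring/acquired/"released" lines around the "<uuid>-events" and "compute_resources" locks come from oslo.concurrency's lockutils, which also logs the wait and hold times shown above. A minimal sketch of that locking pattern, with a hypothetical in-memory event registry standing in for Nova's InstanceEvents; only the lock naming mirrors the log:

    # Sketch of the oslo.concurrency usage behind the lock DEBUG lines above.
    # The registry and helper below are illustrative, not Nova's code.
    from oslo_concurrency import lockutils

    _pending_events = {}  # hypothetical: instance uuid -> {event name: payload}

    def pop_instance_event(instance_uuid, event_name):
        # Matches the "<uuid>-events" lock name seen in the log.
        with lockutils.lock('%s-events' % instance_uuid):
            return _pending_events.get(instance_uuid, {}).pop(event_name, None)

    # Decorator form, as used for coarse named locks such as "compute_resources".
    @lockutils.synchronized('compute_resources')
    def claim_resources():
        pass  # placeholder body for illustration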
Apr 23 03:49:19 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Started> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:49:19 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] VM Started (Lifecycle Event) Apr 23 03:49:19 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:49:19 user nova-compute[71428]: DEBUG nova.policy [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3e80c354abd34bd3a28ddaeec9535af2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5eb0a03655cf4aa78e27c81ea4e1c424', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71428) authorize /opt/stack/nova/nova/policy.py:203}} Apr 23 03:49:19 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 03:49:19 user nova-compute[71428]: INFO nova.compute.manager [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Took 8.07 seconds to spawn the instance on the hypervisor. Apr 23 03:49:19 user nova-compute[71428]: DEBUG nova.compute.manager [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:49:19 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] During sync_power_state the instance has a pending task (spawning). Skip. Apr 23 03:49:19 user nova-compute[71428]: DEBUG nova.compute.manager [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Start spawning the instance on the hypervisor. 
{{(pid=71428) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 23 03:49:19 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Creating instance directory {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 23 03:49:19 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Creating image(s) Apr 23 03:49:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Acquiring lock "/opt/stack/data/nova/instances/1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:49:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Lock "/opt/stack/data/nova/instances/1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:49:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Lock "/opt/stack/data/nova/instances/1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:49:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:49:20 user nova-compute[71428]: INFO nova.compute.manager [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Took 9.03 seconds to build instance. 
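The qemu-img invocations above run through oslo.concurrency's processutils with a prlimit wrapper, which is why they appear as "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info ...". A minimal sketch of the same guarded call against the base image path from the log; a simplified stand-in for Nova's image helpers, not the actual implementation:

    # Sketch: run "qemu-img info --output=json" under the limits seen in the
    # log (1 GiB address space, 30 s CPU time), then parse the JSON result.
    import json

    from oslo_concurrency import processutils

    BASE = ('/opt/stack/data/nova/instances/_base/'
            '935dc3474dddbde110531a31510941caddc0ae83')

    limits = processutils.ProcessLimits(address_space=1073741824, cpu_time=30)
    out, _err = processutils.execute(
        'env', 'LC_ALL=C', 'LANG=C',
        'qemu-img', 'info', BASE, '--force-share', '--output=json',
        prlimit=limits)

    info = json.loads(out)
    print(info['format'], info['virtual-size'])  # image format and size in bytes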
Apr 23 03:49:20 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-a9ec8fdc-66e5-456a-93af-a260dbdbaabc tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Lock "6ee9a44e-e6af-4eec-975c-3991146ce71b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 9.162s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:49:20 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.149s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:49:20 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Acquiring lock "935dc3474dddbde110531a31510941caddc0ae83" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:49:20 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:49:20 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:49:20 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.157s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:49:20 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw 
/opt/stack/data/nova/instances/1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86/disk 1073741824 {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:49:20 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86/disk 1073741824" returned: 0 in 0.052s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:49:20 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.218s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:49:20 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:49:20 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.152s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:49:20 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Checking if we can resize image /opt/stack/data/nova/instances/1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86/disk. 
size=1073741824 {{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 23 03:49:20 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:49:20 user nova-compute[71428]: DEBUG nova.network.neutron [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Successfully created port: dbcb6234-6b80-49f9-b7a1-0f339da7084a {{(pid=71428) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 23 03:49:20 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86/disk --force-share --output=json" returned: 0 in 0.165s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:49:20 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Cannot resize image /opt/stack/data/nova/instances/1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86/disk to a smaller size. 
{{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 23 03:49:20 user nova-compute[71428]: DEBUG nova.objects.instance [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Lazy-loading 'migration_context' on Instance uuid 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:49:20 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Created local disks {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 23 03:49:20 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Ensure instance console log exists: /opt/stack/data/nova/instances/1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86/console.log {{(pid=71428) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 23 03:49:20 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:49:20 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:49:20 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:49:21 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:21 user nova-compute[71428]: DEBUG nova.network.neutron [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Successfully updated port: dbcb6234-6b80-49f9-b7a1-0f339da7084a {{(pid=71428) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 23 03:49:21 user nova-compute[71428]: DEBUG nova.compute.manager [req-d143aafe-e574-412a-9742-2cdea5b03e06 req-90f3c6c4-3518-4bde-b747-84388988f901 service nova] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Received event network-changed-dbcb6234-6b80-49f9-b7a1-0f339da7084a {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 
03:49:21 user nova-compute[71428]: DEBUG nova.compute.manager [req-d143aafe-e574-412a-9742-2cdea5b03e06 req-90f3c6c4-3518-4bde-b747-84388988f901 service nova] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Refreshing instance network info cache due to event network-changed-dbcb6234-6b80-49f9-b7a1-0f339da7084a. {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 23 03:49:21 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-d143aafe-e574-412a-9742-2cdea5b03e06 req-90f3c6c4-3518-4bde-b747-84388988f901 service nova] Acquiring lock "refresh_cache-1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:49:21 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-d143aafe-e574-412a-9742-2cdea5b03e06 req-90f3c6c4-3518-4bde-b747-84388988f901 service nova] Acquired lock "refresh_cache-1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:49:21 user nova-compute[71428]: DEBUG nova.network.neutron [req-d143aafe-e574-412a-9742-2cdea5b03e06 req-90f3c6c4-3518-4bde-b747-84388988f901 service nova] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Refreshing network info cache for port dbcb6234-6b80-49f9-b7a1-0f339da7084a {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 23 03:49:21 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Acquiring lock "refresh_cache-1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:49:21 user nova-compute[71428]: DEBUG nova.network.neutron [req-d143aafe-e574-412a-9742-2cdea5b03e06 req-90f3c6c4-3518-4bde-b747-84388988f901 service nova] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Instance cache missing network info. 
{{(pid=71428) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 23 03:49:21 user nova-compute[71428]: DEBUG nova.network.neutron [req-d143aafe-e574-412a-9742-2cdea5b03e06 req-90f3c6c4-3518-4bde-b747-84388988f901 service nova] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Updating instance_info_cache with network_info: [] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:49:21 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-d143aafe-e574-412a-9742-2cdea5b03e06 req-90f3c6c4-3518-4bde-b747-84388988f901 service nova] Releasing lock "refresh_cache-1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:49:21 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Acquired lock "refresh_cache-1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:49:21 user nova-compute[71428]: DEBUG nova.network.neutron [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Building network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 23 03:49:21 user nova-compute[71428]: DEBUG nova.network.neutron [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Instance cache missing network info. 
{{(pid=71428) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 23 03:49:21 user nova-compute[71428]: DEBUG nova.network.neutron [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Updating instance_info_cache with network_info: [{"id": "dbcb6234-6b80-49f9-b7a1-0f339da7084a", "address": "fa:16:3e:eb:b4:74", "network": {"id": "dca689d5-53e5-466b-bd5c-4c29b330d27e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2008984306-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5eb0a03655cf4aa78e27c81ea4e1c424", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapdbcb6234-6b", "ovs_interfaceid": "dbcb6234-6b80-49f9-b7a1-0f339da7084a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:49:21 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Releasing lock "refresh_cache-1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:49:21 user nova-compute[71428]: DEBUG nova.compute.manager [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Instance network_info: |[{"id": "dbcb6234-6b80-49f9-b7a1-0f339da7084a", "address": "fa:16:3e:eb:b4:74", "network": {"id": "dca689d5-53e5-466b-bd5c-4c29b330d27e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2008984306-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5eb0a03655cf4aa78e27c81ea4e1c424", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapdbcb6234-6b", "ovs_interfaceid": "dbcb6234-6b80-49f9-b7a1-0f339da7084a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 23 03:49:21 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Start 
_get_guest_xml network_info=[{"id": "dbcb6234-6b80-49f9-b7a1-0f339da7084a", "address": "fa:16:3e:eb:b4:74", "network": {"id": "dca689d5-53e5-466b-bd5c-4c29b330d27e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2008984306-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5eb0a03655cf4aa78e27c81ea4e1c424", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapdbcb6234-6b", "ovs_interfaceid": "dbcb6234-6b80-49f9-b7a1-0f339da7084a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'image_id': 'e6127373-9931-4277-9458-eceef653ea1e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 23 03:49:21 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:49:21 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
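Everything _get_guest_xml needs about networking arrives in the network_info list shown above: one entry per VIF with its MAC, tap device name, and per-subnet fixed/floating IPs plus MTU. A minimal sketch of walking that structure with the values from the log, treated as plain dicts rather than Nova's NetworkInfo/VIF model classes:

    # Sketch: pull addresses out of a cached network_info entry, trimmed to
    # the fields used here; values copied from the log above.
    network_info = [{
        "id": "dbcb6234-6b80-49f9-b7a1-0f339da7084a",
        "address": "fa:16:3e:eb:b4:74",
        "devname": "tapdbcb6234-6b",
        "network": {
            "subnets": [{
                "cidr": "10.0.0.0/28",
                "ips": [{"address": "10.0.0.7", "type": "fixed",
                         "floating_ips": []}],
            }],
            "meta": {"mtu": 1442},
        },
    }]

    for vif in network_info:
        subnets = vif["network"]["subnets"]
        fixed = [ip["address"] for s in subnets for ip in s["ips"]]
        floating = [fip["address"] for s in subnets for ip in s["ips"]
                    for fip in ip["floating_ips"]]
        print(vif["devname"], vif["address"], fixed, floating,
              vif["network"]["meta"]["mtu"])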
Apr 23 03:49:21 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71428) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 23 03:49:21 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-23T03:43:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=), allow threads: True {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 23 03:49:21 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Flavor limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 23 03:49:21 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Image limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 23 03:49:21 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Flavor pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 23 03:49:21 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Image pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 23 03:49:21 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 23 03:49:21 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71428) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:564}} Apr 23 03:49:21 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 23 03:49:21 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Got 1 possible topologies {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 23 03:49:21 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 23 03:49:21 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 23 03:49:22 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:49:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-1265143385',display_name='tempest-VolumesAdminNegativeTest-server-1265143385',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-1265143385',id=11,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5eb0a03655cf4aa78e27c81ea4e1c424',ramdisk_id='',reservation_id='r-ttjiy9bj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-594073230',owner_user_name='tempest-VolumesAdminNegativeTest-594073230-project-member'},tags=Ta
gList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:49:20Z,user_data=None,user_id='3e80c354abd34bd3a28ddaeec9535af2',uuid=1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dbcb6234-6b80-49f9-b7a1-0f339da7084a", "address": "fa:16:3e:eb:b4:74", "network": {"id": "dca689d5-53e5-466b-bd5c-4c29b330d27e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2008984306-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5eb0a03655cf4aa78e27c81ea4e1c424", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapdbcb6234-6b", "ovs_interfaceid": "dbcb6234-6b80-49f9-b7a1-0f339da7084a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71428) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 23 03:49:22 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Converting VIF {"id": "dbcb6234-6b80-49f9-b7a1-0f339da7084a", "address": "fa:16:3e:eb:b4:74", "network": {"id": "dca689d5-53e5-466b-bd5c-4c29b330d27e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2008984306-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5eb0a03655cf4aa78e27c81ea4e1c424", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapdbcb6234-6b", "ovs_interfaceid": "dbcb6234-6b80-49f9-b7a1-0f339da7084a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:49:22 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:b4:74,bridge_name='br-int',has_traffic_filtering=True,id=dbcb6234-6b80-49f9-b7a1-0f339da7084a,network=Network(dca689d5-53e5-466b-bd5c-4c29b330d27e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdbcb6234-6b') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:49:22 user nova-compute[71428]: DEBUG nova.objects.instance [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Lazy-loading 'pci_devices' on Instance 
uuid 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:49:22 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] End _get_guest_xml xml= [multi-line libvirt domain XML elided: the XML markup was stripped when this log was captured; recoverable values include uuid 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86, domain name instance-0000000b, nova display name tempest-VolumesAdminNegativeTest-server-1265143385, creation time 2023-04-23 03:49:21, owner tempest-VolumesAdminNegativeTest-594073230-project-member / tempest-VolumesAdminNegativeTest-594073230, sysinfo OpenStack Foundation / OpenStack Nova / 0.0.0, 131072 KiB memory, 1 vCPU, hvm machine, CPU model Nehalem, RNG backend /dev/urandom] {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 23 03:49:22 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:49:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-1265143385',display_name='tempest-VolumesAdminNegativeTest-server-1265143385',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-1265143385',id=11,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5eb0a03655cf4aa78e27c81ea4e1c424',ramdisk_id='',reservation_id='r-ttjiy9bj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-594073230',owner_user_name='tempest-VolumesAdminNegativeTest-594073230-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:49:20Z,user_data=None,user_id='3e80c354abd34bd3a28ddaeec9535af2',uuid=1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dbcb6234-6b80-49f9-b7a1-0f339da7084a", "address": "fa:16:3e:eb:b4:74", "network": {"id": "dca689d5-53e5-466b-bd5c-4c29b330d27e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2008984306-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1",
"type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5eb0a03655cf4aa78e27c81ea4e1c424", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapdbcb6234-6b", "ovs_interfaceid": "dbcb6234-6b80-49f9-b7a1-0f339da7084a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 23 03:49:22 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Converting VIF {"id": "dbcb6234-6b80-49f9-b7a1-0f339da7084a", "address": "fa:16:3e:eb:b4:74", "network": {"id": "dca689d5-53e5-466b-bd5c-4c29b330d27e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2008984306-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5eb0a03655cf4aa78e27c81ea4e1c424", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapdbcb6234-6b", "ovs_interfaceid": "dbcb6234-6b80-49f9-b7a1-0f339da7084a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:49:22 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:b4:74,bridge_name='br-int',has_traffic_filtering=True,id=dbcb6234-6b80-49f9-b7a1-0f339da7084a,network=Network(dca689d5-53e5-466b-bd5c-4c29b330d27e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdbcb6234-6b') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:49:22 user nova-compute[71428]: DEBUG os_vif [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:b4:74,bridge_name='br-int',has_traffic_filtering=True,id=dbcb6234-6b80-49f9-b7a1-0f339da7084a,network=Network(dca689d5-53e5-466b-bd5c-4c29b330d27e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdbcb6234-6b') {{(pid=71428) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 23 03:49:22 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:22 user nova-compute[71428]: DEBUG 
ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:49:22 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 23 03:49:22 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:22 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdbcb6234-6b, may_exist=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:49:22 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdbcb6234-6b, col_values=(('external_ids', {'iface-id': 'dbcb6234-6b80-49f9-b7a1-0f339da7084a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:eb:b4:74', 'vm-uuid': '1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86'}),)) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:49:22 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:22 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:49:22 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:22 user nova-compute[71428]: INFO os_vif [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:b4:74,bridge_name='br-int',has_traffic_filtering=True,id=dbcb6234-6b80-49f9-b7a1-0f339da7084a,network=Network(dca689d5-53e5-466b-bd5c-4c29b330d27e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdbcb6234-6b') Apr 23 03:49:22 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=71428) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 23 03:49:22 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] No VIF found with MAC fa:16:3e:eb:b4:74, not building metadata {{(pid=71428) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 23 03:49:23 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:23 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:23 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:23 user nova-compute[71428]: DEBUG nova.compute.manager [req-67402369-f10d-4fac-9196-932baf15100c req-c04c8b77-dbe0-460e-b159-4a249283318a service nova] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Received event network-vif-plugged-dbcb6234-6b80-49f9-b7a1-0f339da7084a {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:49:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-67402369-f10d-4fac-9196-932baf15100c req-c04c8b77-dbe0-460e-b159-4a249283318a service nova] Acquiring lock "1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:49:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-67402369-f10d-4fac-9196-932baf15100c req-c04c8b77-dbe0-460e-b159-4a249283318a service nova] Lock "1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.002s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:49:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-67402369-f10d-4fac-9196-932baf15100c req-c04c8b77-dbe0-460e-b159-4a249283318a service nova] Lock "1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:49:23 user nova-compute[71428]: DEBUG nova.compute.manager [req-67402369-f10d-4fac-9196-932baf15100c req-c04c8b77-dbe0-460e-b159-4a249283318a service nova] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] No waiting events found dispatching network-vif-plugged-dbcb6234-6b80-49f9-b7a1-0f339da7084a {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:49:23 user nova-compute[71428]: WARNING nova.compute.manager [req-67402369-f10d-4fac-9196-932baf15100c req-c04c8b77-dbe0-460e-b159-4a249283318a service nova] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Received unexpected event network-vif-plugged-dbcb6234-6b80-49f9-b7a1-0f339da7084a for instance with vm_state building and task_state spawning. 
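The plug above is carried out as one ovsdbapp transaction: AddBridgeCommand ensures br-int exists, AddPortCommand creates the tap port, and DbSetCommand stamps the Interface row with the external_ids that let ovn-controller bind the port. A minimal standalone sketch of the same sequence, assuming an OVSDB server reachable at tcp:127.0.0.1:6640 (os-vif derives the real connection from its own configuration; the port name, MAC and UUIDs below are copied from the log entries above):

# Sketch of the plug sequence shown above, using ovsdbapp directly.
from ovsdbapp.backend.ovs_idl import connection
from ovsdbapp.schema.open_vswitch import impl_idl

# Assumption: local OVSDB endpoint; os-vif builds this from its config.
idl = connection.OvsdbIdl.from_server('tcp:127.0.0.1:6640', 'Open_vSwitch')
api = impl_idl.OvsdbIdl(connection.Connection(idl=idl, timeout=10))

external_ids = {
    'iface-id': 'dbcb6234-6b80-49f9-b7a1-0f339da7084a',   # Neutron port UUID
    'iface-status': 'active',
    'attached-mac': 'fa:16:3e:eb:b4:74',
    'vm-uuid': '1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86',
}

# One transaction, mirroring "Running txn n=1 command(idx=0/1)" in the log.
with api.transaction(check_error=True) as txn:
    txn.add(api.add_br('br-int', may_exist=True, datapath_type='system'))
    txn.add(api.add_port('br-int', 'tapdbcb6234-6b', may_exist=True))
    txn.add(api.db_set('Interface', 'tapdbcb6234-6b',
                       ('external_ids', external_ids)))

Unplugging later issues the matching del_port, which is the DelPortCommand that appears further down in this log when the other instance is terminated.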
Apr 23 03:49:23 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:23 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:23 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:25 user nova-compute[71428]: DEBUG nova.compute.manager [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Instance event wait completed in 0 seconds for {{(pid=71428) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 23 03:49:25 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Guest created on hypervisor {{(pid=71428) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 23 03:49:25 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Resumed> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:49:25 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] VM Resumed (Lifecycle Event) Apr 23 03:49:25 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Instance spawned successfully. 
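The "No waiting events found" / "Received unexpected event" warnings and the "Instance event wait completed in 0 seconds" line are two sides of nova's external-event handshake: the spawning thread registers the network-vif-plugged events it intends to wait for, and the API-driven handler pops and signals them as Neutron reports the port up. A much simplified sketch of that pattern (nova's real version is nova.compute.manager.InstanceEvents, which uses eventlet and the per-instance "<uuid>-events" locks visible above):

# Simplified illustration of the external-event handshake, not nova's code.
import threading
from collections import defaultdict

class InstanceEvents:
    def __init__(self):
        self._events = defaultdict(dict)   # instance uuid -> {event name: Event}
        self._lock = threading.Lock()

    def prepare(self, instance_uuid, names):
        # Register the events we expect Neutron to send (done before plugging).
        with self._lock:
            return [self._events[instance_uuid].setdefault(n, threading.Event())
                    for n in names]

    def pop(self, instance_uuid, name):
        # Called when an external event arrives; None means it was unexpected.
        with self._lock:
            return self._events[instance_uuid].pop(name, None)

events = InstanceEvents()
waiters = events.prepare('1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86',
                         ['network-vif-plugged-dbcb6234-6b80-49f9-b7a1-0f339da7084a'])

# External-event path (the "Received event network-vif-plugged-..." entries):
ev = events.pop('1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86',
                'network-vif-plugged-dbcb6234-6b80-49f9-b7a1-0f339da7084a')
if ev is None:
    print('Received unexpected event')   # the WARNING lines in this log
else:
    ev.set()

# Spawner side: wait for every registered event (returns immediately here).
for w in waiters:
    w.wait(timeout=300)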
Apr 23 03:49:25 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 23 03:49:25 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:49:25 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 03:49:25 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Found default for hw_cdrom_bus of ide {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:49:25 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Found default for hw_disk_bus of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:49:25 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Found default for hw_input_bus of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:49:25 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Found default for hw_pointer_model of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:49:25 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Found default for hw_video_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:49:25 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 
1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Found default for hw_vif_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:49:25 user nova-compute[71428]: DEBUG nova.compute.manager [req-05201324-95b0-4cb2-ab33-fd603a36c1f9 req-7b47a4a1-ce37-42aa-bc67-d55d37b81742 service nova] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Received event network-vif-plugged-dbcb6234-6b80-49f9-b7a1-0f339da7084a {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:49:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-05201324-95b0-4cb2-ab33-fd603a36c1f9 req-7b47a4a1-ce37-42aa-bc67-d55d37b81742 service nova] Acquiring lock "1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:49:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-05201324-95b0-4cb2-ab33-fd603a36c1f9 req-7b47a4a1-ce37-42aa-bc67-d55d37b81742 service nova] Lock "1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:49:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-05201324-95b0-4cb2-ab33-fd603a36c1f9 req-7b47a4a1-ce37-42aa-bc67-d55d37b81742 service nova] Lock "1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:49:25 user nova-compute[71428]: DEBUG nova.compute.manager [req-05201324-95b0-4cb2-ab33-fd603a36c1f9 req-7b47a4a1-ce37-42aa-bc67-d55d37b81742 service nova] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] No waiting events found dispatching network-vif-plugged-dbcb6234-6b80-49f9-b7a1-0f339da7084a {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:49:25 user nova-compute[71428]: WARNING nova.compute.manager [req-05201324-95b0-4cb2-ab33-fd603a36c1f9 req-7b47a4a1-ce37-42aa-bc67-d55d37b81742 service nova] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Received unexpected event network-vif-plugged-dbcb6234-6b80-49f9-b7a1-0f339da7084a for instance with vm_state building and task_state spawning. Apr 23 03:49:25 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] During sync_power_state the instance has a pending task (spawning). Skip. 
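The "Attempting to register defaults" / "Found default for ..." entries record, once per instance, the bus and model choices the libvirt driver made because the image left them unset (hw_cdrom_bus=ide, hw_disk_bus=virtio, and so on), so that later attach operations keep using the same values. A rough illustration of the idea, persisting the defaults as image_* keys in the instance system_metadata (the helper below is illustrative, not nova's actual code):

# Record driver-chosen defaults for image properties the image did not set,
# mirroring the hw_cdrom_bus=ide, hw_disk_bus=virtio, ... entries above.
DRIVER_DEFAULTS = {
    'hw_cdrom_bus': 'ide',
    'hw_disk_bus': 'virtio',
    'hw_input_bus': None,
    'hw_pointer_model': None,
    'hw_video_model': 'virtio',
    'hw_vif_model': 'virtio',
}

def register_undefined_instance_details(system_metadata, image_properties):
    for prop, default in DRIVER_DEFAULTS.items():
        key = 'image_' + prop
        if prop not in image_properties and key not in system_metadata:
            system_metadata[key] = default   # persisted with the instance

sysmeta = {}
register_undefined_instance_details(sysmeta, image_properties={'hw_rng_model': 'virtio'})
print(sysmeta['image_hw_disk_bus'])   # -> 'virtio'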
Apr 23 03:49:25 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Started> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:49:25 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] VM Started (Lifecycle Event) Apr 23 03:49:25 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:49:25 user nova-compute[71428]: INFO nova.compute.manager [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Took 5.82 seconds to spawn the instance on the hypervisor. Apr 23 03:49:25 user nova-compute[71428]: DEBUG nova.compute.manager [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:49:25 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 03:49:25 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] During sync_power_state the instance has a pending task (spawning). Skip. Apr 23 03:49:25 user nova-compute[71428]: INFO nova.compute.manager [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Took 6.60 seconds to build instance. 
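The "Synchronizing instance power state ... current DB power_state: 0, VM power_state: 1" entries followed by "... has a pending task (spawning). Skip." show the lifecycle-event sync deliberately standing down while another operation owns the instance. A condensed sketch of that decision (0 and 1 are nova's NOSTATE and RUNNING power-state constants; the full _sync_instance_power_state also stops or reaps instances whose states truly diverge):

# Sketch of the power-state sync decision seen above, not nova's full logic.
NOSTATE, RUNNING = 0, 1

def sync_power_state(db_power_state, vm_power_state, task_state):
    if task_state is not None:
        # e.g. task_state='spawning' above: another operation owns the
        # instance, so the sync backs off ("... has a pending task. Skip.")
        return 'skip'
    if db_power_state != vm_power_state:
        return 'update-db'   # record what the hypervisor actually reports
    return 'in-sync'

print(sync_power_state(db_power_state=NOSTATE, vm_power_state=RUNNING,
                       task_state='spawning'))   # -> 'skip', as in the log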
Apr 23 03:49:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-2f5331cf-5248-4779-ace7-796a35f344aa tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Lock "1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.692s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:49:25 user nova-compute[71428]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:49:25 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] VM Stopped (Lifecycle Event) Apr 23 03:49:25 user nova-compute[71428]: DEBUG nova.compute.manager [None req-f99a8c01-15c0-4b1e-a1ab-c96646ec6b51 None None] [instance: 3ec36a95-88ce-4ed1-9726-7e6d98674dec] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:49:27 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:27 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-40fb21b4-cbee-43b6-ba73-5319bf398107 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Acquiring lock "6f7cf091-3140-4746-bd1c-95c42ea82f6e" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:49:27 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-40fb21b4-cbee-43b6-ba73-5319bf398107 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lock "6f7cf091-3140-4746-bd1c-95c42ea82f6e" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:49:27 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-40fb21b4-cbee-43b6-ba73-5319bf398107 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Acquiring lock "6f7cf091-3140-4746-bd1c-95c42ea82f6e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:49:27 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-40fb21b4-cbee-43b6-ba73-5319bf398107 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lock "6f7cf091-3140-4746-bd1c-95c42ea82f6e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:49:27 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-40fb21b4-cbee-43b6-ba73-5319bf398107 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lock "6f7cf091-3140-4746-bd1c-95c42ea82f6e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71428) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:49:27 user nova-compute[71428]: INFO nova.compute.manager [None req-40fb21b4-cbee-43b6-ba73-5319bf398107 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Terminating instance Apr 23 03:49:27 user nova-compute[71428]: DEBUG nova.compute.manager [None req-40fb21b4-cbee-43b6-ba73-5319bf398107 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Start destroying the instance on the hypervisor. {{(pid=71428) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 23 03:49:27 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:27 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:28 user nova-compute[71428]: DEBUG nova.compute.manager [req-91767a31-cfec-442e-8b64-d7349de291e9 req-79652b74-fcca-4c8e-abcb-407336ce2c0a service nova] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Received event network-vif-unplugged-fcf25b83-715e-4e6e-a7a3-d7f94072883f {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:49:28 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-91767a31-cfec-442e-8b64-d7349de291e9 req-79652b74-fcca-4c8e-abcb-407336ce2c0a service nova] Acquiring lock "6f7cf091-3140-4746-bd1c-95c42ea82f6e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:49:28 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-91767a31-cfec-442e-8b64-d7349de291e9 req-79652b74-fcca-4c8e-abcb-407336ce2c0a service nova] Lock "6f7cf091-3140-4746-bd1c-95c42ea82f6e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:49:28 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-91767a31-cfec-442e-8b64-d7349de291e9 req-79652b74-fcca-4c8e-abcb-407336ce2c0a service nova] Lock "6f7cf091-3140-4746-bd1c-95c42ea82f6e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:49:28 user nova-compute[71428]: DEBUG nova.compute.manager [req-91767a31-cfec-442e-8b64-d7349de291e9 req-79652b74-fcca-4c8e-abcb-407336ce2c0a service nova] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] No waiting events found dispatching network-vif-unplugged-fcf25b83-715e-4e6e-a7a3-d7f94072883f {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:49:28 user nova-compute[71428]: DEBUG nova.compute.manager [req-91767a31-cfec-442e-8b64-d7349de291e9 req-79652b74-fcca-4c8e-abcb-407336ce2c0a service nova] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Received event network-vif-unplugged-fcf25b83-715e-4e6e-a7a3-d7f94072883f for instance with task_state deleting. 
{{(pid=71428) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 23 03:49:28 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:28 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:28 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Instance destroyed successfully. Apr 23 03:49:28 user nova-compute[71428]: DEBUG nova.objects.instance [None req-40fb21b4-cbee-43b6-ba73-5319bf398107 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lazy-loading 'resources' on Instance uuid 6f7cf091-3140-4746-bd1c-95c42ea82f6e {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:49:28 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:28 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-40fb21b4-cbee-43b6-ba73-5319bf398107 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:47:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-262802774',display_name='tempest-TestMinimumBasicScenario-server-262802774',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-262802774',id=8,image_ref='7c47c35a-5c7e-4aee-95e4-4a8cee90e1cd',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLKn6a8rESzq46bLmR05LGujkm1bgXgZ+9BeETyXSPwaGi5H5ZetPmCwt6N2oLGkFtyS9BduJD7406fOIM9/S2/jcwpt1OzWCv1MZI9XkD4sbEN2KdlWlfO9OLzEy4zxDA==',key_name='tempest-TestMinimumBasicScenario-1120445103',keypairs=,launch_index=0,launched_at=2023-04-23T03:47:42Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='8ab0f01751954d04a83b360b2f839716',ramdisk_id='',reservation_id='r-8tf6frub',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7c47c35a-5c7e-4aee-95e4-4a8cee90e1cd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestMinimumBasicScenario-1558592168',owner_user_name='tempest-TestMinimumBasicScenario-1558592168-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-23T03:47:42Z,user_data=None,user_id='93a8dbfd8cef4578aff742813ffe901e',uuid=6f7cf091-3140-4746-bd1c-95c42ea82f6e,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fcf25b83-715e-4e6e-a7a3-d7f94072883f", "address": "fa:16:3e:65:a9:56", "network": {"id": "a603a26f-3b06-4792-80ae-73cea9f9d53b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1385818779-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8ab0f01751954d04a83b360b2f839716", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfcf25b83-71", "ovs_interfaceid": "fcf25b83-715e-4e6e-a7a3-d7f94072883f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 23 03:49:28 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-40fb21b4-cbee-43b6-ba73-5319bf398107 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Converting VIF {"id": "fcf25b83-715e-4e6e-a7a3-d7f94072883f", "address": "fa:16:3e:65:a9:56", "network": {"id": "a603a26f-3b06-4792-80ae-73cea9f9d53b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1385818779-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8ab0f01751954d04a83b360b2f839716", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": 
{"0": "ovn"}}, "devname": "tapfcf25b83-71", "ovs_interfaceid": "fcf25b83-715e-4e6e-a7a3-d7f94072883f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:49:28 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-40fb21b4-cbee-43b6-ba73-5319bf398107 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:a9:56,bridge_name='br-int',has_traffic_filtering=True,id=fcf25b83-715e-4e6e-a7a3-d7f94072883f,network=Network(a603a26f-3b06-4792-80ae-73cea9f9d53b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfcf25b83-71') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:49:28 user nova-compute[71428]: DEBUG os_vif [None req-40fb21b4-cbee-43b6-ba73-5319bf398107 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:a9:56,bridge_name='br-int',has_traffic_filtering=True,id=fcf25b83-715e-4e6e-a7a3-d7f94072883f,network=Network(a603a26f-3b06-4792-80ae-73cea9f9d53b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfcf25b83-71') {{(pid=71428) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 23 03:49:28 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:28 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfcf25b83-71, bridge=br-int, if_exists=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:49:28 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:28 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:49:28 user nova-compute[71428]: INFO os_vif [None req-40fb21b4-cbee-43b6-ba73-5319bf398107 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:a9:56,bridge_name='br-int',has_traffic_filtering=True,id=fcf25b83-715e-4e6e-a7a3-d7f94072883f,network=Network(a603a26f-3b06-4792-80ae-73cea9f9d53b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfcf25b83-71') Apr 23 03:49:28 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-40fb21b4-cbee-43b6-ba73-5319bf398107 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Deleting instance files /opt/stack/data/nova/instances/6f7cf091-3140-4746-bd1c-95c42ea82f6e_del Apr 23 03:49:28 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-40fb21b4-cbee-43b6-ba73-5319bf398107 tempest-TestMinimumBasicScenario-1558592168 
tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Deletion of /opt/stack/data/nova/instances/6f7cf091-3140-4746-bd1c-95c42ea82f6e_del complete Apr 23 03:49:28 user nova-compute[71428]: INFO nova.compute.manager [None req-40fb21b4-cbee-43b6-ba73-5319bf398107 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Took 1.00 seconds to destroy the instance on the hypervisor. Apr 23 03:49:28 user nova-compute[71428]: DEBUG oslo.service.loopingcall [None req-40fb21b4-cbee-43b6-ba73-5319bf398107 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71428) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 23 03:49:28 user nova-compute[71428]: DEBUG nova.compute.manager [-] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Deallocating network for instance {{(pid=71428) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 23 03:49:28 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] deallocate_for_instance() {{(pid=71428) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 23 03:49:29 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:29 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:29 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:29 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Updating instance_info_cache with network_info: [] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:49:29 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Took 0.58 seconds to deallocate network for instance. 
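deallocate_for_instance() above is nova asking Neutron to release the instance's ports and then writing an empty network_info list into the info cache. Roughly equivalent API calls, sketched here with openstacksdk purely for illustration (nova uses its own Neutron client and also honours preserve_on_delete for pre-existing ports; the cloud name below is an assumption):

# Rough openstacksdk equivalent of deallocating the instance's Neutron ports.
import openstack

conn = openstack.connect(cloud='devstack')   # cloud name is an assumption
instance_uuid = '6f7cf091-3140-4746-bd1c-95c42ea82f6e'

# Find the ports bound to this instance and delete the nova-created ones.
for port in conn.network.ports(device_id=instance_uuid):
    conn.network.delete_port(port, ignore_missing=True)

Nova then stores an empty network_info list for the instance, which is the "Updating instance_info_cache with network_info: []" entry above.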
Apr 23 03:49:29 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:29 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:29 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-40fb21b4-cbee-43b6-ba73-5319bf398107 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:49:29 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-40fb21b4-cbee-43b6-ba73-5319bf398107 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:49:29 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-40fb21b4-cbee-43b6-ba73-5319bf398107 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:49:29 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-40fb21b4-cbee-43b6-ba73-5319bf398107 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:49:29 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-40fb21b4-cbee-43b6-ba73-5319bf398107 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.243s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:49:29 user nova-compute[71428]: INFO nova.scheduler.client.report [None req-40fb21b4-cbee-43b6-ba73-5319bf398107 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Deleted allocations for instance 6f7cf091-3140-4746-bd1c-95c42ea82f6e Apr 23 03:49:29 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-40fb21b4-cbee-43b6-ba73-5319bf398107 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lock "6f7cf091-3140-4746-bd1c-95c42ea82f6e" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.012s {{(pid=71428) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:49:30 user nova-compute[71428]: DEBUG nova.compute.manager [req-f928d779-3828-4e76-ae57-599dddd3d46f req-5858d58e-31b8-4ea3-b0b9-cca91e655a6e service nova] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Received event network-vif-plugged-fcf25b83-715e-4e6e-a7a3-d7f94072883f {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:49:30 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-f928d779-3828-4e76-ae57-599dddd3d46f req-5858d58e-31b8-4ea3-b0b9-cca91e655a6e service nova] Acquiring lock "6f7cf091-3140-4746-bd1c-95c42ea82f6e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:49:30 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-f928d779-3828-4e76-ae57-599dddd3d46f req-5858d58e-31b8-4ea3-b0b9-cca91e655a6e service nova] Lock "6f7cf091-3140-4746-bd1c-95c42ea82f6e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:49:30 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-f928d779-3828-4e76-ae57-599dddd3d46f req-5858d58e-31b8-4ea3-b0b9-cca91e655a6e service nova] Lock "6f7cf091-3140-4746-bd1c-95c42ea82f6e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:49:30 user nova-compute[71428]: DEBUG nova.compute.manager [req-f928d779-3828-4e76-ae57-599dddd3d46f req-5858d58e-31b8-4ea3-b0b9-cca91e655a6e service nova] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] No waiting events found dispatching network-vif-plugged-fcf25b83-715e-4e6e-a7a3-d7f94072883f {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:49:30 user nova-compute[71428]: WARNING nova.compute.manager [req-f928d779-3828-4e76-ae57-599dddd3d46f req-5858d58e-31b8-4ea3-b0b9-cca91e655a6e service nova] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Received unexpected event network-vif-plugged-fcf25b83-715e-4e6e-a7a3-d7f94072883f for instance with vm_state deleted and task_state None. 
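The "Inventory has not changed" comparison above is against the provider inventory record shown in the log; placement treats (total - reserved) * allocation_ratio as the schedulable capacity of each resource class, and "Deleted allocations for instance ..." is what hands this instance's share of that capacity back. Worked out with the numbers from this log:

# Capacity implied by the inventory record logged above:
# capacity = (total - reserved) * allocation_ratio per resource class.
inventory = {
    'VCPU':      {'total': 12,    'reserved': 0,   'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 16023, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 40,    'reserved': 0,   'allocation_ratio': 1.0},
}

for rc, inv in inventory.items():
    capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
    print(rc, capacity)
# -> VCPU 48.0, MEMORY_MB 15511.0, DISK_GB 40.0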
Apr 23 03:49:30 user nova-compute[71428]: DEBUG nova.compute.manager [req-f928d779-3828-4e76-ae57-599dddd3d46f req-5858d58e-31b8-4ea3-b0b9-cca91e655a6e service nova] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Received event network-vif-deleted-fcf25b83-715e-4e6e-a7a3-d7f94072883f {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:49:33 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:33 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:38 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:38 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:38 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:41 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:43 user nova-compute[71428]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:49:43 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] VM Stopped (Lifecycle Event) Apr 23 03:49:43 user nova-compute[71428]: DEBUG nova.compute.manager [None req-385d1931-b240-4421-97d4-55f5975316d8 None None] [instance: 6f7cf091-3140-4746-bd1c-95c42ea82f6e] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:49:43 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:43 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:48 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:48 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:53 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:53 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:49:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:01 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:01 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-65bb73b7-e327-4e9a-b28b-00d29524f938 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Acquiring lock "f279f3d3-581d-4d6f-924f-4104ec23832a" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:50:01 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-65bb73b7-e327-4e9a-b28b-00d29524f938 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Lock "f279f3d3-581d-4d6f-924f-4104ec23832a" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:50:01 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-65bb73b7-e327-4e9a-b28b-00d29524f938 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Acquiring lock "f279f3d3-581d-4d6f-924f-4104ec23832a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:50:01 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-65bb73b7-e327-4e9a-b28b-00d29524f938 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Lock "f279f3d3-581d-4d6f-924f-4104ec23832a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:50:01 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-65bb73b7-e327-4e9a-b28b-00d29524f938 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Lock "f279f3d3-581d-4d6f-924f-4104ec23832a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:50:01 user nova-compute[71428]: INFO nova.compute.manager [None req-65bb73b7-e327-4e9a-b28b-00d29524f938 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Terminating instance Apr 23 03:50:01 user nova-compute[71428]: DEBUG nova.compute.manager [None req-65bb73b7-e327-4e9a-b28b-00d29524f938 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Start destroying the instance on the hypervisor. 
{{(pid=71428) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 23 03:50:01 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:01 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:01 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:01 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:01 user nova-compute[71428]: DEBUG nova.compute.manager [req-0c155426-8849-4eb7-9195-b0276817b80a req-0ee2f97a-50e4-4f04-bbb9-5079c563ca98 service nova] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Received event network-vif-unplugged-680802e3-0304-496f-927b-855e4167272b {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:50:01 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-0c155426-8849-4eb7-9195-b0276817b80a req-0ee2f97a-50e4-4f04-bbb9-5079c563ca98 service nova] Acquiring lock "f279f3d3-581d-4d6f-924f-4104ec23832a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:50:01 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-0c155426-8849-4eb7-9195-b0276817b80a req-0ee2f97a-50e4-4f04-bbb9-5079c563ca98 service nova] Lock "f279f3d3-581d-4d6f-924f-4104ec23832a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:50:01 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-0c155426-8849-4eb7-9195-b0276817b80a req-0ee2f97a-50e4-4f04-bbb9-5079c563ca98 service nova] Lock "f279f3d3-581d-4d6f-924f-4104ec23832a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:50:01 user nova-compute[71428]: DEBUG nova.compute.manager [req-0c155426-8849-4eb7-9195-b0276817b80a req-0ee2f97a-50e4-4f04-bbb9-5079c563ca98 service nova] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] No waiting events found dispatching network-vif-unplugged-680802e3-0304-496f-927b-855e4167272b {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:50:01 user nova-compute[71428]: DEBUG nova.compute.manager [req-0c155426-8849-4eb7-9195-b0276817b80a req-0ee2f97a-50e4-4f04-bbb9-5079c563ca98 service nova] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Received event network-vif-unplugged-680802e3-0304-496f-927b-855e4167272b for instance with task_state deleting. {{(pid=71428) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 23 03:50:01 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Instance destroyed successfully. 
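The terminate path above is serialized per instance: do_terminate_instance runs under a lock named after the instance UUID, and the "acquired by ... :: waited" / "released by ... :: held" DEBUG lines come from oslo.concurrency's synchronized wrapper (the "inner" frames in lockutils.py). A minimal sketch of that pattern, assuming stdlib logging at DEBUG is enough to surface the timing messages; the terminate function body is a placeholder, not Nova's logic.

# Minimal sketch of the per-instance locking pattern seen in the log above.
import logging
from oslo_concurrency import lockutils

logging.basicConfig(level=logging.DEBUG)   # lockutils emits the waited/held timings at DEBUG

def terminate_instance(instance_uuid):
    # The inner function is decorated at call time so the lock name can be the
    # instance UUID, mirroring the do_terminate_instance pattern in the log.
    @lockutils.synchronized(instance_uuid)
    def do_terminate_instance():
        print(f"destroying {instance_uuid} on the hypervisor")  # placeholder body

    do_terminate_instance()

terminate_instance("f279f3d3-581d-4d6f-924f-4104ec23832a")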
Apr 23 03:50:01 user nova-compute[71428]: DEBUG nova.objects.instance [None req-65bb73b7-e327-4e9a-b28b-00d29524f938 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Lazy-loading 'resources' on Instance uuid f279f3d3-581d-4d6f-924f-4104ec23832a {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:50:01 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-65bb73b7-e327-4e9a-b28b-00d29524f938 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:46:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-329028935',display_name='tempest-ServerStableDeviceRescueTest-server-329028935',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverstabledevicerescuetest-server-329028935',id=2,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJloyUtsCDIs9T1+8g/hMHz6r8pR2SR9PggBoD/KR0LjhZoNK/nEt54tYO6knV7E+845Tt0p37M8EMNHSd7z4jcysmifK+fuISVR4KNyRapKSmK+MZDTAfLCZ5mhzyFU5A==',key_name='tempest-keypair-884305512',keypairs=,launch_index=0,launched_at=2023-04-23T03:46:35Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='5864626fffa7443c800b244471966d65',ramdisk_id='',reservation_id='r-lnebeump',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerStableDeviceRescueTest-847110386',owner_user_name='tempest-ServerStableDeviceRescueTest-847110386-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-23T03:48:13Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3311b6ae249f41269fde041f6e441840',uuid=f279f3d3-581d-4d6f-924f-4104ec23832a,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "680802e3-0304-496f-927b-855e4167272b", "address": "fa:16:3e:45:76:bf", "network": {"id": "2354786d-cb38-48f3-9585-d88f27219024", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-812722065-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.29", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5864626fffa7443c800b244471966d65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap680802e3-03", "ovs_interfaceid": "680802e3-0304-496f-927b-855e4167272b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 23 03:50:01 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-65bb73b7-e327-4e9a-b28b-00d29524f938 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Converting VIF {"id": "680802e3-0304-496f-927b-855e4167272b", "address": "fa:16:3e:45:76:bf", "network": {"id": "2354786d-cb38-48f3-9585-d88f27219024", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-812722065-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.29", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5864626fffa7443c800b244471966d65", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap680802e3-03", "ovs_interfaceid": "680802e3-0304-496f-927b-855e4167272b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:50:01 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-65bb73b7-e327-4e9a-b28b-00d29524f938 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:45:76:bf,bridge_name='br-int',has_traffic_filtering=True,id=680802e3-0304-496f-927b-855e4167272b,network=Network(2354786d-cb38-48f3-9585-d88f27219024),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap680802e3-03') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:50:01 user nova-compute[71428]: DEBUG os_vif [None req-65bb73b7-e327-4e9a-b28b-00d29524f938 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:45:76:bf,bridge_name='br-int',has_traffic_filtering=True,id=680802e3-0304-496f-927b-855e4167272b,network=Network(2354786d-cb38-48f3-9585-d88f27219024),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap680802e3-03') {{(pid=71428) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 23 03:50:01 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) 
__log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:01 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap680802e3-03, bridge=br-int, if_exists=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:50:01 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:01 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:50:01 user nova-compute[71428]: INFO os_vif [None req-65bb73b7-e327-4e9a-b28b-00d29524f938 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:45:76:bf,bridge_name='br-int',has_traffic_filtering=True,id=680802e3-0304-496f-927b-855e4167272b,network=Network(2354786d-cb38-48f3-9585-d88f27219024),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap680802e3-03') Apr 23 03:50:01 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-65bb73b7-e327-4e9a-b28b-00d29524f938 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Deleting instance files /opt/stack/data/nova/instances/f279f3d3-581d-4d6f-924f-4104ec23832a_del Apr 23 03:50:01 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-65bb73b7-e327-4e9a-b28b-00d29524f938 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Deletion of /opt/stack/data/nova/instances/f279f3d3-581d-4d6f-924f-4104ec23832a_del complete Apr 23 03:50:01 user nova-compute[71428]: INFO nova.compute.manager [None req-65bb73b7-e327-4e9a-b28b-00d29524f938 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Took 0.65 seconds to destroy the instance on the hypervisor. Apr 23 03:50:01 user nova-compute[71428]: DEBUG oslo.service.loopingcall [None req-65bb73b7-e327-4e9a-b28b-00d29524f938 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71428) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 23 03:50:01 user nova-compute[71428]: DEBUG nova.compute.manager [-] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Deallocating network for instance {{(pid=71428) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 23 03:50:01 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] deallocate_for_instance() {{(pid=71428) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 23 03:50:02 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Updating instance_info_cache with network_info: [] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:50:02 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Took 1.05 seconds to deallocate network for instance. Apr 23 03:50:02 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-65bb73b7-e327-4e9a-b28b-00d29524f938 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:50:02 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-65bb73b7-e327-4e9a-b28b-00d29524f938 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:50:03 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-65bb73b7-e327-4e9a-b28b-00d29524f938 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:50:03 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-65bb73b7-e327-4e9a-b28b-00d29524f938 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:50:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-65bb73b7-e327-4e9a-b28b-00d29524f938 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.234s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:50:03 user nova-compute[71428]: INFO 
nova.scheduler.client.report [None req-65bb73b7-e327-4e9a-b28b-00d29524f938 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Deleted allocations for instance f279f3d3-581d-4d6f-924f-4104ec23832a Apr 23 03:50:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-65bb73b7-e327-4e9a-b28b-00d29524f938 tempest-ServerStableDeviceRescueTest-847110386 tempest-ServerStableDeviceRescueTest-847110386-project-member] Lock "f279f3d3-581d-4d6f-924f-4104ec23832a" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.124s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:50:03 user nova-compute[71428]: DEBUG nova.compute.manager [req-c2232ed9-086c-4156-84cf-053c07ae6d2d req-90ff6639-8075-4cd3-8af9-54858cf6638a service nova] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Received event network-vif-plugged-680802e3-0304-496f-927b-855e4167272b {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:50:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-c2232ed9-086c-4156-84cf-053c07ae6d2d req-90ff6639-8075-4cd3-8af9-54858cf6638a service nova] Acquiring lock "f279f3d3-581d-4d6f-924f-4104ec23832a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:50:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-c2232ed9-086c-4156-84cf-053c07ae6d2d req-90ff6639-8075-4cd3-8af9-54858cf6638a service nova] Lock "f279f3d3-581d-4d6f-924f-4104ec23832a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:50:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-c2232ed9-086c-4156-84cf-053c07ae6d2d req-90ff6639-8075-4cd3-8af9-54858cf6638a service nova] Lock "f279f3d3-581d-4d6f-924f-4104ec23832a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:50:03 user nova-compute[71428]: DEBUG nova.compute.manager [req-c2232ed9-086c-4156-84cf-053c07ae6d2d req-90ff6639-8075-4cd3-8af9-54858cf6638a service nova] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] No waiting events found dispatching network-vif-plugged-680802e3-0304-496f-927b-855e4167272b {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:50:03 user nova-compute[71428]: WARNING nova.compute.manager [req-c2232ed9-086c-4156-84cf-053c07ae6d2d req-90ff6639-8075-4cd3-8af9-54858cf6638a service nova] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Received unexpected event network-vif-plugged-680802e3-0304-496f-927b-855e4167272b for instance with vm_state deleted and task_state None. 
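The resource-tracker update a few lines up reports the host inventory to Placement with reserved amounts and allocation ratios; the capacity Placement schedules against is (total - reserved) * allocation_ratio per resource class. A small worked example with the numbers from the log (the helper name is ours, not a Nova or Placement API):

# Capacity implied by the inventory logged above:
#   usable = (total - reserved) * allocation_ratio
inventory = {
    "VCPU":      {"total": 12,    "reserved": 0,   "allocation_ratio": 4.0},
    "MEMORY_MB": {"total": 16023, "reserved": 512, "allocation_ratio": 1.0},
    "DISK_GB":   {"total": 40,    "reserved": 0,   "allocation_ratio": 1.0},
}

def usable_capacity(inv):
    return {rc: (v["total"] - v["reserved"]) * v["allocation_ratio"]
            for rc, v in inv.items()}

print(usable_capacity(inventory))
# {'VCPU': 48.0, 'MEMORY_MB': 15511.0, 'DISK_GB': 40.0}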
Apr 23 03:50:03 user nova-compute[71428]: DEBUG nova.compute.manager [req-c2232ed9-086c-4156-84cf-053c07ae6d2d req-90ff6639-8075-4cd3-8af9-54858cf6638a service nova] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Received event network-vif-deleted-680802e3-0304-496f-927b-855e4167272b {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:50:03 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:04 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Acquiring lock "7c1f1f24-62f5-4e51-8c74-b5b89b6585a4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:50:04 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "7c1f1f24-62f5-4e51-8c74-b5b89b6585a4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:50:04 user nova-compute[71428]: DEBUG nova.compute.manager [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Starting instance... {{(pid=71428) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 23 03:50:04 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:50:04 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:50:04 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71428) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 23 03:50:04 user nova-compute[71428]: INFO nova.compute.claims [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Claim successful on node user Apr 23 03:50:05 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:50:05 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:50:05 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.348s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:50:05 user nova-compute[71428]: DEBUG nova.compute.manager [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Start building networks asynchronously for instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 23 03:50:05 user nova-compute[71428]: DEBUG nova.compute.manager [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Allocating IP information in the background. {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 23 03:50:05 user nova-compute[71428]: DEBUG nova.network.neutron [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] allocate_for_instance() {{(pid=71428) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 23 03:50:05 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 23 03:50:05 user nova-compute[71428]: DEBUG nova.compute.manager [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Start building block device mappings for instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 23 03:50:05 user nova-compute[71428]: DEBUG nova.compute.manager [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Start spawning the instance on the hypervisor. {{(pid=71428) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 23 03:50:05 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Creating instance directory {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 23 03:50:05 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Creating image(s) Apr 23 03:50:05 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Acquiring lock "/opt/stack/data/nova/instances/7c1f1f24-62f5-4e51-8c74-b5b89b6585a4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:50:05 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "/opt/stack/data/nova/instances/7c1f1f24-62f5-4e51-8c74-b5b89b6585a4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:50:05 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "/opt/stack/data/nova/instances/7c1f1f24-62f5-4e51-8c74-b5b89b6585a4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:50:05 user nova-compute[71428]: DEBUG nova.policy [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0a2c459cad014b07b2613e5e261d88aa', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '24fff486a500421397ecb935828582cd', 'project_domain_id': 'default', 
'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71428) authorize /opt/stack/nova/nova/policy.py:203}} Apr 23 03:50:05 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:50:05 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.142s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:50:05 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Acquiring lock "935dc3474dddbde110531a31510941caddc0ae83" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:50:05 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:50:05 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:50:05 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.154s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:50:05 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None 
req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/7c1f1f24-62f5-4e51-8c74-b5b89b6585a4/disk 1073741824 {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:50:05 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/7c1f1f24-62f5-4e51-8c74-b5b89b6585a4/disk 1073741824" returned: 0 in 0.053s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:50:05 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.213s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:50:05 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:50:06 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.141s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:50:06 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Checking if we can resize image /opt/stack/data/nova/instances/7c1f1f24-62f5-4e51-8c74-b5b89b6585a4/disk. 
size=1073741824 {{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 23 03:50:06 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7c1f1f24-62f5-4e51-8c74-b5b89b6585a4/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:50:06 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7c1f1f24-62f5-4e51-8c74-b5b89b6585a4/disk --force-share --output=json" returned: 0 in 0.145s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:50:06 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Cannot resize image /opt/stack/data/nova/instances/7c1f1f24-62f5-4e51-8c74-b5b89b6585a4/disk to a smaller size. {{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 23 03:50:06 user nova-compute[71428]: DEBUG nova.objects.instance [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lazy-loading 'migration_context' on Instance uuid 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:50:06 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Created local disks {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 23 03:50:06 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Ensure instance console log exists: /opt/stack/data/nova/instances/7c1f1f24-62f5-4e51-8c74-b5b89b6585a4/console.log {{(pid=71428) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 23 03:50:06 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:50:06 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "vgpu_resources" acquired by 
"nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.002s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:50:06 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:50:06 user nova-compute[71428]: DEBUG nova.network.neutron [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Successfully created port: 2a195833-5430-4cc1-a938-5b4157c32300 {{(pid=71428) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 23 03:50:06 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:06 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:50:06 user nova-compute[71428]: DEBUG nova.network.neutron [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Successfully updated port: 2a195833-5430-4cc1-a938-5b4157c32300 {{(pid=71428) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 23 03:50:06 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Acquiring lock "refresh_cache-7c1f1f24-62f5-4e51-8c74-b5b89b6585a4" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:50:06 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Acquired lock "refresh_cache-7c1f1f24-62f5-4e51-8c74-b5b89b6585a4" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:50:06 user nova-compute[71428]: DEBUG nova.network.neutron [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Building network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 23 03:50:06 user nova-compute[71428]: DEBUG nova.compute.manager [req-1e806cd9-346a-4381-81c4-7eaf1ce7e076 req-5c4c5577-1afc-411f-a631-ed0150059205 service nova] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Received event network-changed-2a195833-5430-4cc1-a938-5b4157c32300 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:50:06 user nova-compute[71428]: DEBUG nova.compute.manager [req-1e806cd9-346a-4381-81c4-7eaf1ce7e076 
req-5c4c5577-1afc-411f-a631-ed0150059205 service nova] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Refreshing instance network info cache due to event network-changed-2a195833-5430-4cc1-a938-5b4157c32300. {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 23 03:50:06 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-1e806cd9-346a-4381-81c4-7eaf1ce7e076 req-5c4c5577-1afc-411f-a631-ed0150059205 service nova] Acquiring lock "refresh_cache-7c1f1f24-62f5-4e51-8c74-b5b89b6585a4" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:50:06 user nova-compute[71428]: DEBUG nova.network.neutron [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Instance cache missing network info. {{(pid=71428) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 23 03:50:07 user nova-compute[71428]: DEBUG nova.network.neutron [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Updating instance_info_cache with network_info: [{"id": "2a195833-5430-4cc1-a938-5b4157c32300", "address": "fa:16:3e:c3:a4:eb", "network": {"id": "6adb189c-9b46-4da4-a34b-31edf8685498", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-759967402-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "24fff486a500421397ecb935828582cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a195833-54", "ovs_interfaceid": "2a195833-5430-4cc1-a938-5b4157c32300", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:50:07 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Releasing lock "refresh_cache-7c1f1f24-62f5-4e51-8c74-b5b89b6585a4" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:50:07 user nova-compute[71428]: DEBUG nova.compute.manager [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Instance network_info: |[{"id": "2a195833-5430-4cc1-a938-5b4157c32300", "address": "fa:16:3e:c3:a4:eb", "network": {"id": "6adb189c-9b46-4da4-a34b-31edf8685498", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-759967402-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": 
[]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "24fff486a500421397ecb935828582cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a195833-54", "ovs_interfaceid": "2a195833-5430-4cc1-a938-5b4157c32300", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 23 03:50:07 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-1e806cd9-346a-4381-81c4-7eaf1ce7e076 req-5c4c5577-1afc-411f-a631-ed0150059205 service nova] Acquired lock "refresh_cache-7c1f1f24-62f5-4e51-8c74-b5b89b6585a4" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:50:07 user nova-compute[71428]: DEBUG nova.network.neutron [req-1e806cd9-346a-4381-81c4-7eaf1ce7e076 req-5c4c5577-1afc-411f-a631-ed0150059205 service nova] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Refreshing network info cache for port 2a195833-5430-4cc1-a938-5b4157c32300 {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 23 03:50:07 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Start _get_guest_xml network_info=[{"id": "2a195833-5430-4cc1-a938-5b4157c32300", "address": "fa:16:3e:c3:a4:eb", "network": {"id": "6adb189c-9b46-4da4-a34b-31edf8685498", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-759967402-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "24fff486a500421397ecb935828582cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a195833-54", "ovs_interfaceid": "2a195833-5430-4cc1-a938-5b4157c32300", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 
'disk', 'guest_format': None, 'boot_index': 0, 'image_id': 'e6127373-9931-4277-9458-eceef653ea1e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 23 03:50:07 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:50:07 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:50:07 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71428) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 23 03:50:07 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-23T03:43:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=), allow threads: True {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 23 03:50:07 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Flavor limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 23 03:50:07 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Image limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 23 03:50:07 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Flavor pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 23 03:50:07 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 
tempest-AttachVolumeNegativeTest-636753786-project-member] Image pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 23 03:50:07 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 23 03:50:07 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 23 03:50:07 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 23 03:50:07 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Got 1 possible topologies {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 23 03:50:07 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 23 03:50:07 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 23 03:50:07 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:50:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-458473256',display_name='tempest-AttachVolumeNegativeTest-server-458473256',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-458473256',id=12,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPW+NA3vXdFxJXRfoNCWzZ6Pz/y62h/Owo9QJnwMZPgQAIglks3olXUz15miaWu+L14NHfoTREI5jCse5O5kM4puAOyJ3IeJt0XKaKpq44fQskU03DPweyp4+IKVldZSew==',key_name='tempest-keypair-152318519',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='24fff486a500421397ecb935828582cd',ramdisk_id='',reservation_id='r-30oc0m0z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-636753786',owner_user_name='tempest-AttachVolumeNegativeTest-636753786-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:50:05Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0a2c459cad014b07b2613e5e261d88aa',uuid=7c1f1f24-62f5-4e51-8c74-b5b89b6585a4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2a195833-5430-4cc1-a938-5b4157c32300", "address": "fa:16:3e:c3:a4:eb", "network": {"id": "6adb189c-9b46-4da4-a34b-31edf8685498", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-759967402-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "24fff486a500421397ecb935828582cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a195833-54", "ovs_interfaceid": "2a195833-5430-4cc1-a938-5b4157c32300", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71428) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 23 03:50:07 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Converting VIF {"id": "2a195833-5430-4cc1-a938-5b4157c32300", "address": "fa:16:3e:c3:a4:eb", "network": {"id": "6adb189c-9b46-4da4-a34b-31edf8685498", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-759967402-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "24fff486a500421397ecb935828582cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a195833-54", "ovs_interfaceid": "2a195833-5430-4cc1-a938-5b4157c32300", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:50:07 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c3:a4:eb,bridge_name='br-int',has_traffic_filtering=True,id=2a195833-5430-4cc1-a938-5b4157c32300,network=Network(6adb189c-9b46-4da4-a34b-31edf8685498),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a195833-54') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:50:07 user nova-compute[71428]: DEBUG nova.objects.instance [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lazy-loading 'pci_devices' on Instance uuid 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:50:07 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] End _get_guest_xml xml= Apr 23 03:50:07 user nova-compute[71428]: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4 Apr 23 03:50:07 user nova-compute[71428]: instance-0000000c Apr 23 03:50:07 user nova-compute[71428]: 131072 Apr 23 03:50:07 user nova-compute[71428]: 1 Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: tempest-AttachVolumeNegativeTest-server-458473256 Apr 23 03:50:07 user nova-compute[71428]: 2023-04-23 03:50:07 Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: 128 Apr 23 03:50:07 user nova-compute[71428]: 1 Apr 23 03:50:07 user nova-compute[71428]: 0 Apr 23 03:50:07 user nova-compute[71428]: 0 Apr 23 03:50:07 user nova-compute[71428]: 1 Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: tempest-AttachVolumeNegativeTest-636753786-project-member Apr 23 03:50:07 user nova-compute[71428]: tempest-AttachVolumeNegativeTest-636753786 Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: OpenStack Foundation Apr 23 03:50:07 user nova-compute[71428]: OpenStack Nova Apr 23 03:50:07 user nova-compute[71428]: 0.0.0 Apr 23 03:50:07 user 
nova-compute[71428]: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4 Apr 23 03:50:07 user nova-compute[71428]: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4 Apr 23 03:50:07 user nova-compute[71428]: Virtual Machine Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: hvm Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Nehalem Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: /dev/urandom Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: Apr 23 03:50:07 user nova-compute[71428]: {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 23 03:50:07 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:50:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-458473256',display_name='tempest-AttachVolumeNegativeTest-server-458473256',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-458473256',id=12,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPW+NA3vXdFxJXRfoNCWzZ6Pz/y62h/Owo9QJnwMZPgQAIglks3olXUz15miaWu+L14NHfoTREI5jCse5O5kM4puAOyJ3IeJt0XKaKpq44fQskU03DPweyp4+IKVldZSew==',key_name='tempest-keypair-152318519',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='24fff486a500421397ecb935828582cd',ramdisk_id='',reservation_id='r-30oc0m0z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-636753786',owner_user_name='tempest-AttachVolumeNegativeTest-636753786-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:50:05Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0a2c459cad014b07b2613e5e261d88aa',uuid=7c1f1f24-62f5-4e51-8c74-b5b89b6585a4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2a195833-5430-4cc1-a938-5b4157c32300", "address": "fa:16:3e:c3:a4:eb", "network": {"id": "6adb189c-9b46-4da4-a34b-31edf8685498", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-759967402-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "24fff486a500421397ecb935828582cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a195833-54", "ovs_interfaceid": "2a195833-5430-4cc1-a938-5b4157c32300", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 23 03:50:07 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Converting VIF {"id": "2a195833-5430-4cc1-a938-5b4157c32300", "address": "fa:16:3e:c3:a4:eb", "network": {"id": "6adb189c-9b46-4da4-a34b-31edf8685498", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-759967402-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": 
false, "tenant_id": "24fff486a500421397ecb935828582cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a195833-54", "ovs_interfaceid": "2a195833-5430-4cc1-a938-5b4157c32300", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:50:07 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c3:a4:eb,bridge_name='br-int',has_traffic_filtering=True,id=2a195833-5430-4cc1-a938-5b4157c32300,network=Network(6adb189c-9b46-4da4-a34b-31edf8685498),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a195833-54') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:50:07 user nova-compute[71428]: DEBUG os_vif [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c3:a4:eb,bridge_name='br-int',has_traffic_filtering=True,id=2a195833-5430-4cc1-a938-5b4157c32300,network=Network(6adb189c-9b46-4da4-a34b-31edf8685498),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a195833-54') {{(pid=71428) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 23 03:50:07 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:07 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:50:07 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 23 03:50:07 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:07 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2a195833-54, may_exist=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:50:07 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2a195833-54, col_values=(('external_ids', {'iface-id': '2a195833-5430-4cc1-a938-5b4157c32300', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c3:a4:eb', 'vm-uuid': '7c1f1f24-62f5-4e51-8c74-b5b89b6585a4'}),)) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:50:07 user nova-compute[71428]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:07 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:50:07 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:07 user nova-compute[71428]: INFO os_vif [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c3:a4:eb,bridge_name='br-int',has_traffic_filtering=True,id=2a195833-5430-4cc1-a938-5b4157c32300,network=Network(6adb189c-9b46-4da4-a34b-31edf8685498),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a195833-54') Apr 23 03:50:07 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] No BDM found with device name vda, not building metadata. {{(pid=71428) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 23 03:50:07 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] No VIF found with MAC fa:16:3e:c3:a4:eb, not building metadata {{(pid=71428) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 23 03:50:08 user nova-compute[71428]: DEBUG nova.network.neutron [req-1e806cd9-346a-4381-81c4-7eaf1ce7e076 req-5c4c5577-1afc-411f-a631-ed0150059205 service nova] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Updated VIF entry in instance network info cache for port 2a195833-5430-4cc1-a938-5b4157c32300. 
{{(pid=71428) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 23 03:50:08 user nova-compute[71428]: DEBUG nova.network.neutron [req-1e806cd9-346a-4381-81c4-7eaf1ce7e076 req-5c4c5577-1afc-411f-a631-ed0150059205 service nova] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Updating instance_info_cache with network_info: [{"id": "2a195833-5430-4cc1-a938-5b4157c32300", "address": "fa:16:3e:c3:a4:eb", "network": {"id": "6adb189c-9b46-4da4-a34b-31edf8685498", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-759967402-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "24fff486a500421397ecb935828582cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a195833-54", "ovs_interfaceid": "2a195833-5430-4cc1-a938-5b4157c32300", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:50:08 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-1e806cd9-346a-4381-81c4-7eaf1ce7e076 req-5c4c5577-1afc-411f-a631-ed0150059205 service nova] Releasing lock "refresh_cache-7c1f1f24-62f5-4e51-8c74-b5b89b6585a4" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:50:08 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:50:08 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:50:08 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:08 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:08 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:08 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:09 user nova-compute[71428]: DEBUG nova.compute.manager [req-206e6118-e51b-4890-ba9b-aa0e48465a1d req-6d7ee72b-9d72-4cf9-9489-62345507ece8 service nova] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Received event network-vif-plugged-2a195833-5430-4cc1-a938-5b4157c32300 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:50:09 user nova-compute[71428]: DEBUG 
oslo_concurrency.lockutils [req-206e6118-e51b-4890-ba9b-aa0e48465a1d req-6d7ee72b-9d72-4cf9-9489-62345507ece8 service nova] Acquiring lock "7c1f1f24-62f5-4e51-8c74-b5b89b6585a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:50:09 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-206e6118-e51b-4890-ba9b-aa0e48465a1d req-6d7ee72b-9d72-4cf9-9489-62345507ece8 service nova] Lock "7c1f1f24-62f5-4e51-8c74-b5b89b6585a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:50:09 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-206e6118-e51b-4890-ba9b-aa0e48465a1d req-6d7ee72b-9d72-4cf9-9489-62345507ece8 service nova] Lock "7c1f1f24-62f5-4e51-8c74-b5b89b6585a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:50:09 user nova-compute[71428]: DEBUG nova.compute.manager [req-206e6118-e51b-4890-ba9b-aa0e48465a1d req-6d7ee72b-9d72-4cf9-9489-62345507ece8 service nova] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] No waiting events found dispatching network-vif-plugged-2a195833-5430-4cc1-a938-5b4157c32300 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:50:09 user nova-compute[71428]: WARNING nova.compute.manager [req-206e6118-e51b-4890-ba9b-aa0e48465a1d req-6d7ee72b-9d72-4cf9-9489-62345507ece8 service nova] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Received unexpected event network-vif-plugged-2a195833-5430-4cc1-a938-5b4157c32300 for instance with vm_state building and task_state spawning. 
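[Editor's note] The ovsdbapp transaction a few records above (AddBridgeCommand, AddPortCommand, DbSetCommand on the Interface row) is how os-vif wires the tap device into br-int and tags it with the iface-id/attached-mac/vm-uuid that OVN uses to bind the port. The following is a minimal illustrative sketch, not os-vif code, showing the same outcome driven by hand through ovs-vsctl; the bridge, device and id values are copied from the log records above.

    import subprocess

    def plug_port(bridge="br-int", dev="tap2a195833-54",
                  iface_id="2a195833-5430-4cc1-a938-5b4157c32300",
                  mac="fa:16:3e:c3:a4:eb",
                  vm_uuid="7c1f1f24-62f5-4e51-8c74-b5b89b6585a4"):
        # Ensure the integration bridge exists (idempotent, like may_exist=True).
        subprocess.run(["ovs-vsctl", "--may-exist", "add-br", bridge], check=True)
        # Add the tap port and set the external_ids Neutron/OVN match on.
        subprocess.run(
            ["ovs-vsctl", "--may-exist", "add-port", bridge, dev,
             "--", "set", "Interface", dev,
             "external_ids:iface-id=%s" % iface_id,
             "external_ids:iface-status=active",
             "external_ids:attached-mac=%s" % mac,
             "external_ids:vm-uuid=%s" % vm_uuid],
            check=True)

Once the external_ids are set, ovn-controller binds the logical port and Neutron sends the network-vif-plugged event seen in the records that follow.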
Apr 23 03:50:09 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:09 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:09 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:09 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:50:10 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:50:10 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:50:10 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:50:10 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:50:10 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Auditing locally available compute resources for user (node: user) {{(pid=71428) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 23 03:50:10 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Resumed> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:50:10 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] VM Resumed (Lifecycle Event) Apr 23 03:50:10 user nova-compute[71428]: DEBUG nova.compute.manager [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Instance event wait completed in 0 seconds for {{(pid=71428) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 23 03:50:10 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 
tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Guest created on hypervisor {{(pid=71428) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 23 03:50:10 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Instance spawned successfully. Apr 23 03:50:10 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 23 03:50:10 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:50:10 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 03:50:10 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Found default for hw_cdrom_bus of ide {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:50:10 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Found default for hw_disk_bus of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:50:10 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Found default for hw_input_bus of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:50:10 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Found default for hw_pointer_model of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:50:10 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Found default for 
hw_video_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:50:10 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Found default for hw_vif_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:50:10 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] During sync_power_state the instance has a pending task (spawning). Skip. Apr 23 03:50:10 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Started> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:50:10 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] VM Started (Lifecycle Event) Apr 23 03:50:10 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:50:10 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 03:50:10 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] During sync_power_state the instance has a pending task (spawning). Skip. Apr 23 03:50:10 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6ee9a44e-e6af-4eec-975c-3991146ce71b/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:50:11 user nova-compute[71428]: INFO nova.compute.manager [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Took 5.52 seconds to spawn the instance on the hypervisor. 
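[Editor's note] The "Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info ... --force-share --output=json" records above are the resource tracker's periodic disk audit: qemu-img is run under a helper that caps address space and CPU time. A minimal sketch of the same call through oslo.concurrency is shown below; it is illustrative rather than Nova's exact code, and the disk path is simply copied from the log.

    from oslo_concurrency import processutils

    # Cap the child at 1 GiB of address space and 30 s of CPU time, matching
    # the --as/--cpu flags in the logged prlimit wrapper (assumed values).
    limits = processutils.ProcessLimits(address_space=1073741824, cpu_time=30)
    out, _err = processutils.execute(
        'env', 'LC_ALL=C', 'LANG=C',
        'qemu-img', 'info',
        '/opt/stack/data/nova/instances/6ee9a44e-e6af-4eec-975c-3991146ce71b/disk',
        '--force-share', '--output=json',
        prlimit=limits)

--force-share lets the probe read an image that a running guest already holds open, which is why the audit can run while instances are active.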
Apr 23 03:50:11 user nova-compute[71428]: DEBUG nova.compute.manager [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:50:11 user nova-compute[71428]: DEBUG nova.compute.manager [req-86777a4b-3b54-4cfe-9b3b-ab2d315b3c5d req-eb915423-c35e-4525-b974-c02529915c8d service nova] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Received event network-vif-plugged-2a195833-5430-4cc1-a938-5b4157c32300 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:50:11 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-86777a4b-3b54-4cfe-9b3b-ab2d315b3c5d req-eb915423-c35e-4525-b974-c02529915c8d service nova] Acquiring lock "7c1f1f24-62f5-4e51-8c74-b5b89b6585a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:50:11 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-86777a4b-3b54-4cfe-9b3b-ab2d315b3c5d req-eb915423-c35e-4525-b974-c02529915c8d service nova] Lock "7c1f1f24-62f5-4e51-8c74-b5b89b6585a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:50:11 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-86777a4b-3b54-4cfe-9b3b-ab2d315b3c5d req-eb915423-c35e-4525-b974-c02529915c8d service nova] Lock "7c1f1f24-62f5-4e51-8c74-b5b89b6585a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:50:11 user nova-compute[71428]: DEBUG nova.compute.manager [req-86777a4b-3b54-4cfe-9b3b-ab2d315b3c5d req-eb915423-c35e-4525-b974-c02529915c8d service nova] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] No waiting events found dispatching network-vif-plugged-2a195833-5430-4cc1-a938-5b4157c32300 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:50:11 user nova-compute[71428]: WARNING nova.compute.manager [req-86777a4b-3b54-4cfe-9b3b-ab2d315b3c5d req-eb915423-c35e-4525-b974-c02529915c8d service nova] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Received unexpected event network-vif-plugged-2a195833-5430-4cc1-a938-5b4157c32300 for instance with vm_state building and task_state spawning. Apr 23 03:50:11 user nova-compute[71428]: INFO nova.compute.manager [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Took 6.19 seconds to build instance. 
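[Editor's note] The paired "Acquiring lock ... by ...", "acquired ... waited" and "released ... held" records above and below come from oslo.concurrency's lock decorator wrapping the per-instance event handling and the build itself. A minimal sketch of that pattern, with a placeholder lock name and function rather than Nova's own, looks like this:

    from oslo_concurrency import lockutils

    # Serialize work on one instance's external events; entering and leaving
    # the decorated function produces the acquired/released debug records.
    @lockutils.synchronized('7c1f1f24-62f5-4e51-8c74-b5b89b6585a4-events')
    def pop_event():
        # critical section: only one green thread handles this instance's
        # events at a time
        pass

    pop_event()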
Apr 23 03:50:11 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f2f79885-bf1d-4bf5-9eb9-ec152a22fb51 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "7c1f1f24-62f5-4e51-8c74-b5b89b6585a4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.288s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:50:11 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6ee9a44e-e6af-4eec-975c-3991146ce71b/disk --force-share --output=json" returned: 0 in 0.160s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:50:11 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6ee9a44e-e6af-4eec-975c-3991146ce71b/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:50:11 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6ee9a44e-e6af-4eec-975c-3991146ce71b/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:50:11 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7c1f1f24-62f5-4e51-8c74-b5b89b6585a4/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:50:11 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7c1f1f24-62f5-4e51-8c74-b5b89b6585a4/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:50:11 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7c1f1f24-62f5-4e51-8c74-b5b89b6585a4/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:50:11 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 
-- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7c1f1f24-62f5-4e51-8c74-b5b89b6585a4/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:50:11 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/954f41b2-5c67-41a4-b38b-ebf3ad60cac7/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:50:11 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/954f41b2-5c67-41a4-b38b-ebf3ad60cac7/disk --force-share --output=json" returned: 0 in 0.145s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:50:11 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/954f41b2-5c67-41a4-b38b-ebf3ad60cac7/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:50:11 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/954f41b2-5c67-41a4-b38b-ebf3ad60cac7/disk --force-share --output=json" returned: 0 in 0.158s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:50:11 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:50:12 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk --force-share --output=json" returned: 0 in 0.241s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:50:12 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk --force-share --output=json {{(pid=71428) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:50:12 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk --force-share --output=json" returned: 0 in 0.153s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:50:12 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:50:12 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:12 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86/disk --force-share --output=json" returned: 0 in 0.155s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:50:12 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:50:12 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86/disk --force-share --output=json" returned: 0 in 0.162s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:50:12 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91756f90-733e-4aa5-9108-d2d8b1d020fe/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:50:12 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91756f90-733e-4aa5-9108-d2d8b1d020fe/disk --force-share --output=json" returned: 0 in 0.162s {{(pid=71428) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:50:12 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91756f90-733e-4aa5-9108-d2d8b1d020fe/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:50:12 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91756f90-733e-4aa5-9108-d2d8b1d020fe/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:50:13 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:50:13 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:50:13 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Hypervisor/Node resource view: name=user free_ram=8362MB free_disk=26.26611328125GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", 
"product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", 
"dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71428) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 23 03:50:13 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:50:13 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:50:13 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 954f41b2-5c67-41a4-b38b-ebf3ad60cac7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:50:13 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 91756f90-733e-4aa5-9108-d2d8b1d020fe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:50:13 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance c8480a7c-b5de-4f66-a4f0-08fc679b0dfd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:50:13 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 6ee9a44e-e6af-4eec-975c-3991146ce71b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:50:13 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:50:13 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
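Each of the six instances above holds an identical allocation in placement. Summing those allocations, plus the 512 MB the host reserves for itself (visible as reserved=512 in the MEMORY_MB inventory logged below), reproduces the final resource view that follows: used_vcpus=6, used_disk=6GB, used_ram=1280MB. A short arithmetic sketch with those values:

```python
# Per-instance allocations exactly as logged by
# _remove_deleted_instances_allocations for the six active instances.
allocations = [{"DISK_GB": 1, "MEMORY_MB": 128, "VCPU": 1}] * 6

used_vcpus = sum(a["VCPU"] for a in allocations)        # 6
used_disk_gb = sum(a["DISK_GB"] for a in allocations)   # 6

# The "used_ram" figure in the final resource view also includes the 512 MB
# the host keeps for itself (reserved=512 in the MEMORY_MB inventory).
reserved_host_memory_mb = 512
used_ram_mb = reserved_host_memory_mb + sum(a["MEMORY_MB"] for a in allocations)

print(used_vcpus, used_disk_gb, used_ram_mb)  # 6 6 1280
```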
{{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:50:13 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Total usable vcpus: 12, total allocated vcpus: 6 {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 23 03:50:13 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Final resource view: name=user phys_ram=16023MB used_ram=1280MB phys_disk=40GB used_disk=6GB total_vcpus=12 used_vcpus=6 pci_stats=[] {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 23 03:50:13 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:13 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:50:13 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:50:13 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Compute_service record updated for user:user {{(pid=71428) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 23 03:50:13 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.397s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:50:14 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:50:14 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Starting heal instance info cache {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 23 03:50:14 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Rebuilding the list of instances to heal {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 23 03:50:14 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock 
"refresh_cache-954f41b2-5c67-41a4-b38b-ebf3ad60cac7" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:50:14 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquired lock "refresh_cache-954f41b2-5c67-41a4-b38b-ebf3ad60cac7" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:50:14 user nova-compute[71428]: DEBUG nova.network.neutron [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Forcefully refreshing network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 23 03:50:14 user nova-compute[71428]: DEBUG nova.objects.instance [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lazy-loading 'info_cache' on Instance uuid 954f41b2-5c67-41a4-b38b-ebf3ad60cac7 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:50:15 user nova-compute[71428]: DEBUG nova.network.neutron [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Updating instance_info_cache with network_info: [{"id": "97d945a1-b86e-4a6d-af52-23084b8eb175", "address": "fa:16:3e:ee:63:58", "network": {"id": "bc639aed-946f-458c-b08d-ce3037f5507c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-558421264-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6f44fe9ba07e4e2f925d1a8952356e04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap97d945a1-b8", "ovs_interfaceid": "97d945a1-b86e-4a6d-af52-23084b8eb175", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:50:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Releasing lock "refresh_cache-954f41b2-5c67-41a4-b38b-ebf3ad60cac7" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:50:15 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Updated the network info_cache for instance {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 23 03:50:15 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:50:15 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71428) 
run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:50:15 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:50:15 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:50:15 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71428) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 23 03:50:16 user nova-compute[71428]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:50:16 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] VM Stopped (Lifecycle Event) Apr 23 03:50:16 user nova-compute[71428]: DEBUG nova.compute.manager [None req-b162b5ad-5cf9-4b73-98b4-147c1b60e45f None None] [instance: f279f3d3-581d-4d6f-924f-4104ec23832a] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:50:17 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:21 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Acquiring lock "90cce4ec-f48c-408f-8d8f-46414bde01df" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:50:21 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lock "90cce4ec-f48c-408f-8d8f-46414bde01df" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:50:21 user nova-compute[71428]: DEBUG nova.compute.manager [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Starting instance... 
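The new build request for 90cce4ec-f48c-408f-8d8f-46414bde01df is serialized under a lock named after the instance UUID; oslo.concurrency's lockutils is what emits the Acquiring/acquired/released lines seen throughout this log. A minimal sketch of the same per-instance locking pattern, assuming an in-process (non-external) lock; this illustrates the lockutils API rather than Nova's actual _locked_do_build_and_run_instance:

```python
from oslo_concurrency import lockutils

def build_and_run_instance(instance_uuid):
    # One lock per instance UUID, so concurrent build requests for the same
    # instance cannot interleave -- the pattern visible in the log above.
    @lockutils.synchronized(instance_uuid)
    def _locked_do_build_and_run_instance():
        print(f"Starting instance {instance_uuid}...")
        # ... claim resources, allocate networking, spawn on the hypervisor ...

    _locked_do_build_and_run_instance()

build_and_run_instance("90cce4ec-f48c-408f-8d8f-46414bde01df")
```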
{{(pid=71428) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 23 03:50:22 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:50:22 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:50:22 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71428) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 23 03:50:22 user nova-compute[71428]: INFO nova.compute.claims [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Claim successful on node user Apr 23 03:50:22 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:50:22 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:50:22 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:50:22 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:50:22 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71428) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 23 03:50:22 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71428) _transition 
/usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 03:50:22 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 03:50:22 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:22 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.330s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:50:22 user nova-compute[71428]: DEBUG nova.compute.manager [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Start building networks asynchronously for instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 23 03:50:22 user nova-compute[71428]: DEBUG nova.compute.manager [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Allocating IP information in the background. {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 23 03:50:22 user nova-compute[71428]: DEBUG nova.network.neutron [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] allocate_for_instance() {{(pid=71428) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 23 03:50:22 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 23 03:50:22 user nova-compute[71428]: DEBUG nova.compute.manager [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Start building block device mappings for instance. 
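The inventory reported for provider 3017e09c-9289-4a8e-8061-3ff90149e985 bounds how much the scheduler can place on this node: for each resource class, placement treats (total - reserved) * allocation_ratio as the allocatable capacity. A quick sketch of that computation using the exact numbers from the log:

```python
# Inventory exactly as logged for provider 3017e09c-9289-4a8e-8061-3ff90149e985.
inventory = {
    "VCPU":      {"total": 12,    "reserved": 0,   "allocation_ratio": 4.0},
    "MEMORY_MB": {"total": 16023, "reserved": 512, "allocation_ratio": 1.0},
    "DISK_GB":   {"total": 40,    "reserved": 0,   "allocation_ratio": 1.0},
}

def schedulable(inv):
    # Placement considers (total - reserved) * allocation_ratio allocatable.
    return {rc: (v["total"] - v["reserved"]) * v["allocation_ratio"]
            for rc, v in inv.items()}

print(schedulable(inventory))
# {'VCPU': 48.0, 'MEMORY_MB': 15511.0, 'DISK_GB': 40.0}
```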
{{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 23 03:50:22 user nova-compute[71428]: DEBUG nova.policy [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '93a8dbfd8cef4578aff742813ffe901e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8ab0f01751954d04a83b360b2f839716', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71428) authorize /opt/stack/nova/nova/policy.py:203}} Apr 23 03:50:22 user nova-compute[71428]: DEBUG nova.compute.manager [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Start spawning the instance on the hypervisor. {{(pid=71428) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 23 03:50:22 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Creating instance directory {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 23 03:50:22 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Creating image(s) Apr 23 03:50:22 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Acquiring lock "/opt/stack/data/nova/instances/90cce4ec-f48c-408f-8d8f-46414bde01df/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:50:22 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lock "/opt/stack/data/nova/instances/90cce4ec-f48c-408f-8d8f-46414bde01df/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:50:22 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lock "/opt/stack/data/nova/instances/90cce4ec-f48c-408f-8d8f-46414bde01df/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:50:22 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None 
req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Acquiring lock "9ea37d45b90fb6f5b86dbb3a9ebe404149f0f222" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:50:22 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lock "9ea37d45b90fb6f5b86dbb3a9ebe404149f0f222" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:50:23 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/9ea37d45b90fb6f5b86dbb3a9ebe404149f0f222.part --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:50:23 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/9ea37d45b90fb6f5b86dbb3a9ebe404149f0f222.part --force-share --output=json" returned: 0 in 0.152s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:50:23 user nova-compute[71428]: DEBUG nova.virt.images [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] 9c1b8f6e-3455-4961-bbec-6a8a6708371c was qcow2, converting to raw {{(pid=71428) fetch_to_raw /opt/stack/nova/nova/virt/images.py:165}} Apr 23 03:50:23 user nova-compute[71428]: DEBUG nova.privsep.utils [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71428) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 23 03:50:23 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/9ea37d45b90fb6f5b86dbb3a9ebe404149f0f222.part /opt/stack/data/nova/instances/_base/9ea37d45b90fb6f5b86dbb3a9ebe404149f0f222.converted {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:50:23 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 
/opt/stack/data/nova/instances/_base/9ea37d45b90fb6f5b86dbb3a9ebe404149f0f222.part /opt/stack/data/nova/instances/_base/9ea37d45b90fb6f5b86dbb3a9ebe404149f0f222.converted" returned: 0 in 0.141s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:50:23 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/9ea37d45b90fb6f5b86dbb3a9ebe404149f0f222.converted --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:50:23 user nova-compute[71428]: DEBUG nova.network.neutron [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Successfully created port: c9e332a6-c2c7-41d3-9646-e5f672d8cb95 {{(pid=71428) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 23 03:50:23 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/9ea37d45b90fb6f5b86dbb3a9ebe404149f0f222.converted --force-share --output=json" returned: 0 in 0.165s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:50:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lock "9ea37d45b90fb6f5b86dbb3a9ebe404149f0f222" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.795s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:50:23 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/9ea37d45b90fb6f5b86dbb3a9ebe404149f0f222 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:50:23 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/9ea37d45b90fb6f5b86dbb3a9ebe404149f0f222 --force-share --output=json" returned: 0 in 0.154s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:50:23 user nova-compute[71428]: DEBUG 
oslo_concurrency.lockutils [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Acquiring lock "9ea37d45b90fb6f5b86dbb3a9ebe404149f0f222" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:50:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lock "9ea37d45b90fb6f5b86dbb3a9ebe404149f0f222" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:50:23 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/9ea37d45b90fb6f5b86dbb3a9ebe404149f0f222 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:50:23 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:23 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/9ea37d45b90fb6f5b86dbb3a9ebe404149f0f222 --force-share --output=json" returned: 0 in 0.147s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:50:23 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/9ea37d45b90fb6f5b86dbb3a9ebe404149f0f222,backing_fmt=raw /opt/stack/data/nova/instances/90cce4ec-f48c-408f-8d8f-46414bde01df/disk 1073741824 {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:50:23 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/9ea37d45b90fb6f5b86dbb3a9ebe404149f0f222,backing_fmt=raw /opt/stack/data/nova/instances/90cce4ec-f48c-408f-8d8f-46414bde01df/disk 1073741824" returned: 0 in 0.059s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:50:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-04800526-02b9-47fc-9ba4-542ab18461b2 
tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lock "9ea37d45b90fb6f5b86dbb3a9ebe404149f0f222" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.213s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:50:23 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/9ea37d45b90fb6f5b86dbb3a9ebe404149f0f222 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:50:24 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/9ea37d45b90fb6f5b86dbb3a9ebe404149f0f222 --force-share --output=json" returned: 0 in 0.158s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:50:24 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Checking if we can resize image /opt/stack/data/nova/instances/90cce4ec-f48c-408f-8d8f-46414bde01df/disk. 
size=1073741824 {{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 23 03:50:24 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/90cce4ec-f48c-408f-8d8f-46414bde01df/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:50:24 user nova-compute[71428]: DEBUG nova.network.neutron [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Successfully updated port: c9e332a6-c2c7-41d3-9646-e5f672d8cb95 {{(pid=71428) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 23 03:50:24 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Acquiring lock "refresh_cache-90cce4ec-f48c-408f-8d8f-46414bde01df" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:50:24 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Acquired lock "refresh_cache-90cce4ec-f48c-408f-8d8f-46414bde01df" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:50:24 user nova-compute[71428]: DEBUG nova.network.neutron [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Building network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 23 03:50:24 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/90cce4ec-f48c-408f-8d8f-46414bde01df/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:50:24 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Cannot resize image /opt/stack/data/nova/instances/90cce4ec-f48c-408f-8d8f-46414bde01df/disk to a smaller size. 
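The image-backend sequence above is: probe the downloaded .part file, convert it from qcow2 to raw in the _base cache, create the per-instance disk as a qcow2 overlay whose backing file is the cached base image, and finally check whether the overlay needs growing to the flavor's 1 GiB root disk (here it does not, hence "Cannot resize image ... to a smaller size"). A hedged sketch of the same sequence using the qemu-img commands visible in the log; the paths are copied from the log and error handling is omitted:

```python
import json
import subprocess

BASE = "/opt/stack/data/nova/instances/_base/9ea37d45b90fb6f5b86dbb3a9ebe404149f0f222"
DISK = "/opt/stack/data/nova/instances/90cce4ec-f48c-408f-8d8f-46414bde01df/disk"
ROOT_BYTES = 1 * 1024 ** 3  # flavor root_gb=1 -> 1073741824 bytes

def img_info(path):
    # Same probe the log shows (minus the prlimit wrapper Nova adds).
    out = subprocess.check_output(
        ["qemu-img", "info", path, "--force-share", "--output=json"])
    return json.loads(out)

# Per-instance disk: a qcow2 overlay backed by the cached raw base image.
subprocess.check_call(
    ["qemu-img", "create", "-f", "qcow2",
     "-o", f"backing_file={BASE},backing_fmt=raw", DISK, str(ROOT_BYTES)])

# can_resize_image-style check: only ever grow the disk, never shrink it.
virtual_size = img_info(DISK)["virtual-size"]
if ROOT_BYTES > virtual_size:
    subprocess.check_call(["qemu-img", "resize", DISK, str(ROOT_BYTES)])
else:
    print("Cannot resize image to a smaller size; leaving it as-is.")
```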
{{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 23 03:50:24 user nova-compute[71428]: DEBUG nova.objects.instance [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lazy-loading 'migration_context' on Instance uuid 90cce4ec-f48c-408f-8d8f-46414bde01df {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:50:24 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Created local disks {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 23 03:50:24 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Ensure instance console log exists: /opt/stack/data/nova/instances/90cce4ec-f48c-408f-8d8f-46414bde01df/console.log {{(pid=71428) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 23 03:50:24 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:50:24 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:50:24 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:50:24 user nova-compute[71428]: DEBUG nova.compute.manager [req-7a036921-e3aa-423d-b38f-d8342fc228d0 req-c23c84ae-f6dc-4fa6-b68b-34ec7207925f service nova] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Received event network-changed-c9e332a6-c2c7-41d3-9646-e5f672d8cb95 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:50:24 user nova-compute[71428]: DEBUG nova.compute.manager [req-7a036921-e3aa-423d-b38f-d8342fc228d0 req-c23c84ae-f6dc-4fa6-b68b-34ec7207925f service nova] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Refreshing instance network info cache due to event network-changed-c9e332a6-c2c7-41d3-9646-e5f672d8cb95. 
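While the disk is being built, Neutron notifies Nova through the external-events API that port c9e332a6-c2c7-41d3-9646-e5f672d8cb95 changed, and the compute manager responds by refreshing that instance's network info cache under its refresh_cache-<uuid> lock. A minimal, purely illustrative sketch of that kind of dispatch; the handler table and function bodies are hypothetical stand-ins, not the real external_instance_event:

```python
from oslo_concurrency import lockutils

def refresh_network_info_cache(instance_uuid, port_id):
    # Matches the pattern in the log: take the per-instance refresh_cache
    # lock, then rebuild the cached network info for the changed port.
    with lockutils.lock(f"refresh_cache-{instance_uuid}"):
        print(f"Refreshing network info cache for port {port_id} "
              f"on instance {instance_uuid}")
        # ... re-query Neutron for the port and rewrite instance_info_cache ...

# Hypothetical mapping from external event names to handlers.
HANDLERS = {"network-changed": refresh_network_info_cache}

def external_instance_event(instance_uuid, event_name, tag):
    handler = HANDLERS.get(event_name)
    if handler:
        handler(instance_uuid, tag)

external_instance_event("90cce4ec-f48c-408f-8d8f-46414bde01df",
                        "network-changed",
                        "c9e332a6-c2c7-41d3-9646-e5f672d8cb95")
```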
{{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 23 03:50:24 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-7a036921-e3aa-423d-b38f-d8342fc228d0 req-c23c84ae-f6dc-4fa6-b68b-34ec7207925f service nova] Acquiring lock "refresh_cache-90cce4ec-f48c-408f-8d8f-46414bde01df" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:50:24 user nova-compute[71428]: DEBUG nova.network.neutron [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Instance cache missing network info. {{(pid=71428) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 23 03:50:24 user nova-compute[71428]: DEBUG nova.network.neutron [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Updating instance_info_cache with network_info: [{"id": "c9e332a6-c2c7-41d3-9646-e5f672d8cb95", "address": "fa:16:3e:58:83:cd", "network": {"id": "a603a26f-3b06-4792-80ae-73cea9f9d53b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1385818779-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8ab0f01751954d04a83b360b2f839716", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9e332a6-c2", "ovs_interfaceid": "c9e332a6-c2c7-41d3-9646-e5f672d8cb95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:50:24 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Releasing lock "refresh_cache-90cce4ec-f48c-408f-8d8f-46414bde01df" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:50:24 user nova-compute[71428]: DEBUG nova.compute.manager [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Instance network_info: |[{"id": "c9e332a6-c2c7-41d3-9646-e5f672d8cb95", "address": "fa:16:3e:58:83:cd", "network": {"id": "a603a26f-3b06-4792-80ae-73cea9f9d53b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1385818779-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8ab0f01751954d04a83b360b2f839716", "mtu": 1442, "physical_network": null, "tunneled": 
true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9e332a6-c2", "ovs_interfaceid": "c9e332a6-c2c7-41d3-9646-e5f672d8cb95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 23 03:50:24 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-7a036921-e3aa-423d-b38f-d8342fc228d0 req-c23c84ae-f6dc-4fa6-b68b-34ec7207925f service nova] Acquired lock "refresh_cache-90cce4ec-f48c-408f-8d8f-46414bde01df" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:50:24 user nova-compute[71428]: DEBUG nova.network.neutron [req-7a036921-e3aa-423d-b38f-d8342fc228d0 req-c23c84ae-f6dc-4fa6-b68b-34ec7207925f service nova] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Refreshing network info cache for port c9e332a6-c2c7-41d3-9646-e5f672d8cb95 {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 23 03:50:24 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Start _get_guest_xml network_info=[{"id": "c9e332a6-c2c7-41d3-9646-e5f672d8cb95", "address": "fa:16:3e:58:83:cd", "network": {"id": "a603a26f-3b06-4792-80ae-73cea9f9d53b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1385818779-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8ab0f01751954d04a83b360b2f839716", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9e332a6-c2", "ovs_interfaceid": "c9e332a6-c2c7-41d3-9646-e5f672d8cb95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:50:19Z,direct_url=,disk_format='qcow2',id=9c1b8f6e-3455-4961-bbec-6a8a6708371c,min_disk=0,min_ram=0,name='tempest-scenario-img--1585092168',owner='8ab0f01751954d04a83b360b2f839716',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:50:21Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'image_id': '9c1b8f6e-3455-4961-bbec-6a8a6708371c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71428) _get_guest_xml 
/opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 23 03:50:24 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:50:24 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:50:24 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71428) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 23 03:50:24 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-23T03:43:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:50:19Z,direct_url=,disk_format='qcow2',id=9c1b8f6e-3455-4961-bbec-6a8a6708371c,min_disk=0,min_ram=0,name='tempest-scenario-img--1585092168',owner='8ab0f01751954d04a83b360b2f839716',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:50:21Z,virtual_size=,visibility=), allow threads: True {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 23 03:50:24 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Flavor limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 23 03:50:24 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Image limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 23 03:50:24 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Flavor pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 23 03:50:24 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Image pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 23 03:50:24 user nova-compute[71428]: DEBUG 
nova.virt.hardware [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 23 03:50:24 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 23 03:50:24 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 23 03:50:24 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Got 1 possible topologies {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 23 03:50:24 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 23 03:50:24 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 23 03:50:24 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:50:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-524528732',display_name='tempest-TestMinimumBasicScenario-server-524528732',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-524528732',id=13,image_ref='9c1b8f6e-3455-4961-bbec-6a8a6708371c',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIAmBt4gctDSizuDDJrh4pQ0geBulERx8NzMYT+fgBg4gRMdKT3R/5EiARevhCiAiWq2K21pOaqKy/T04eohZfy3bE6LZ/4YHDUN1RqyVnppPqbYR1ciJUTaIPmbmDg0QQ==',key_name='tempest-TestMinimumBasicScenario-962162734',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8ab0f01751954d04a83b360b2f839716',ramdisk_id='',reservation_id='r-m79x3jny',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9c1b8f6e-3455-4961-bbec-6a8a6708371c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-1558592168',owner_user_name='tempest-TestMinimumBasicScenario-1558592168-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:50:23Z,user_data=None,user_id='93a8dbfd8cef4578aff742813ffe901e',uuid=90cce4ec-f48c-408f-8d8f-46414bde01df,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c9e332a6-c2c7-41d3-9646-e5f672d8cb95", "address": "fa:16:3e:58:83:cd", "network": {"id": "a603a26f-3b06-4792-80ae-73cea9f9d53b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1385818779-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8ab0f01751954d04a83b360b2f839716", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9e332a6-c2", "ovs_interfaceid": "c9e332a6-c2c7-41d3-9646-e5f672d8cb95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71428) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 23 03:50:24 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Converting VIF {"id": "c9e332a6-c2c7-41d3-9646-e5f672d8cb95", "address": "fa:16:3e:58:83:cd", "network": {"id": "a603a26f-3b06-4792-80ae-73cea9f9d53b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1385818779-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8ab0f01751954d04a83b360b2f839716", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9e332a6-c2", "ovs_interfaceid": 
"c9e332a6-c2c7-41d3-9646-e5f672d8cb95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:50:24 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:83:cd,bridge_name='br-int',has_traffic_filtering=True,id=c9e332a6-c2c7-41d3-9646-e5f672d8cb95,network=Network(a603a26f-3b06-4792-80ae-73cea9f9d53b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9e332a6-c2') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:50:24 user nova-compute[71428]: DEBUG nova.objects.instance [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lazy-loading 'pci_devices' on Instance uuid 90cce4ec-f48c-408f-8d8f-46414bde01df {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:50:24 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] End _get_guest_xml xml= Apr 23 03:50:24 user nova-compute[71428]: 90cce4ec-f48c-408f-8d8f-46414bde01df Apr 23 03:50:24 user nova-compute[71428]: instance-0000000d Apr 23 03:50:24 user nova-compute[71428]: 131072 Apr 23 03:50:24 user nova-compute[71428]: 1 Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: tempest-TestMinimumBasicScenario-server-524528732 Apr 23 03:50:24 user nova-compute[71428]: 2023-04-23 03:50:24 Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: 128 Apr 23 03:50:24 user nova-compute[71428]: 1 Apr 23 03:50:24 user nova-compute[71428]: 0 Apr 23 03:50:24 user nova-compute[71428]: 0 Apr 23 03:50:24 user nova-compute[71428]: 1 Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: tempest-TestMinimumBasicScenario-1558592168-project-member Apr 23 03:50:24 user nova-compute[71428]: tempest-TestMinimumBasicScenario-1558592168 Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: OpenStack Foundation Apr 23 03:50:24 user nova-compute[71428]: OpenStack Nova Apr 23 03:50:24 user nova-compute[71428]: 0.0.0 Apr 23 03:50:24 user nova-compute[71428]: 90cce4ec-f48c-408f-8d8f-46414bde01df Apr 23 03:50:24 user nova-compute[71428]: 90cce4ec-f48c-408f-8d8f-46414bde01df Apr 23 03:50:24 user nova-compute[71428]: Virtual Machine Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user 
nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: hvm Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Nehalem Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: /dev/urandom Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: Apr 23 03:50:24 user nova-compute[71428]: {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 23 03:50:25 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:50:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-524528732',display_name='tempest-TestMinimumBasicScenario-server-524528732',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-524528732',id=13,image_ref='9c1b8f6e-3455-4961-bbec-6a8a6708371c',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIAmBt4gctDSizuDDJrh4pQ0geBulERx8NzMYT+fgBg4gRMdKT3R/5EiARevhCiAiWq2K21pOaqKy/T04eohZfy3bE6LZ/4YHDUN1RqyVnppPqbYR1ciJUTaIPmbmDg0QQ==',key_name='tempest-TestMinimumBasicScenario-962162734',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8ab0f01751954d04a83b360b2f839716',ramdisk_id='',reservation_id='r-m79x3jny',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9c1b8f6e-3455-4961-bbec-6a8a6708371c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-1558592168',owner_user_name='tempest-TestMinimumBasicScenario-1558592168-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:50:23Z,user_data=None,user_id='93a8dbfd8cef4578aff742813ffe901e',uuid=90cce4ec-f48c-408f-8d8f-46414bde01df,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c9e332a6-c2c7-41d3-9646-e5f672d8cb95", "address": "fa:16:3e:58:83:cd", "network": {"id": "a603a26f-3b06-4792-80ae-73cea9f9d53b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1385818779-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8ab0f01751954d04a83b360b2f839716", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9e332a6-c2", "ovs_interfaceid": "c9e332a6-c2c7-41d3-9646-e5f672d8cb95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 23 03:50:25 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Converting VIF {"id": "c9e332a6-c2c7-41d3-9646-e5f672d8cb95", "address": "fa:16:3e:58:83:cd", "network": {"id": "a603a26f-3b06-4792-80ae-73cea9f9d53b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1385818779-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8ab0f01751954d04a83b360b2f839716", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9e332a6-c2", "ovs_interfaceid": 
"c9e332a6-c2c7-41d3-9646-e5f672d8cb95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:50:25 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:83:cd,bridge_name='br-int',has_traffic_filtering=True,id=c9e332a6-c2c7-41d3-9646-e5f672d8cb95,network=Network(a603a26f-3b06-4792-80ae-73cea9f9d53b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9e332a6-c2') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:50:25 user nova-compute[71428]: DEBUG os_vif [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:83:cd,bridge_name='br-int',has_traffic_filtering=True,id=c9e332a6-c2c7-41d3-9646-e5f672d8cb95,network=Network(a603a26f-3b06-4792-80ae-73cea9f9d53b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9e332a6-c2') {{(pid=71428) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 23 03:50:25 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:25 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:50:25 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 23 03:50:25 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:25 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc9e332a6-c2, may_exist=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:50:25 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc9e332a6-c2, col_values=(('external_ids', {'iface-id': 'c9e332a6-c2c7-41d3-9646-e5f672d8cb95', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:58:83:cd', 'vm-uuid': '90cce4ec-f48c-408f-8d8f-46414bde01df'}),)) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:50:25 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:25 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:50:25 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:25 user nova-compute[71428]: INFO os_vif [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:83:cd,bridge_name='br-int',has_traffic_filtering=True,id=c9e332a6-c2c7-41d3-9646-e5f672d8cb95,network=Network(a603a26f-3b06-4792-80ae-73cea9f9d53b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9e332a6-c2') Apr 23 03:50:25 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] No BDM found with device name vda, not building metadata. {{(pid=71428) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 23 03:50:25 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] No VIF found with MAC fa:16:3e:58:83:cd, not building metadata {{(pid=71428) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 23 03:50:25 user nova-compute[71428]: DEBUG nova.network.neutron [req-7a036921-e3aa-423d-b38f-d8342fc228d0 req-c23c84ae-f6dc-4fa6-b68b-34ec7207925f service nova] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Updated VIF entry in instance network info cache for port c9e332a6-c2c7-41d3-9646-e5f672d8cb95. 
{{(pid=71428) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 23 03:50:25 user nova-compute[71428]: DEBUG nova.network.neutron [req-7a036921-e3aa-423d-b38f-d8342fc228d0 req-c23c84ae-f6dc-4fa6-b68b-34ec7207925f service nova] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Updating instance_info_cache with network_info: [{"id": "c9e332a6-c2c7-41d3-9646-e5f672d8cb95", "address": "fa:16:3e:58:83:cd", "network": {"id": "a603a26f-3b06-4792-80ae-73cea9f9d53b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1385818779-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8ab0f01751954d04a83b360b2f839716", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9e332a6-c2", "ovs_interfaceid": "c9e332a6-c2c7-41d3-9646-e5f672d8cb95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:50:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-7a036921-e3aa-423d-b38f-d8342fc228d0 req-c23c84ae-f6dc-4fa6-b68b-34ec7207925f service nova] Releasing lock "refresh_cache-90cce4ec-f48c-408f-8d8f-46414bde01df" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:50:26 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:26 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:26 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:26 user nova-compute[71428]: DEBUG nova.compute.manager [req-029bd62a-173a-4f64-ad59-339e3e934bc1 req-e2a78a0b-7a45-47e9-b877-0bea9be271d2 service nova] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Received event network-vif-plugged-c9e332a6-c2c7-41d3-9646-e5f672d8cb95 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:50:26 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-029bd62a-173a-4f64-ad59-339e3e934bc1 req-e2a78a0b-7a45-47e9-b877-0bea9be271d2 service nova] Acquiring lock "90cce4ec-f48c-408f-8d8f-46414bde01df-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:50:26 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-029bd62a-173a-4f64-ad59-339e3e934bc1 req-e2a78a0b-7a45-47e9-b877-0bea9be271d2 service nova] Lock "90cce4ec-f48c-408f-8d8f-46414bde01df-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:50:26 user 
nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-029bd62a-173a-4f64-ad59-339e3e934bc1 req-e2a78a0b-7a45-47e9-b877-0bea9be271d2 service nova] Lock "90cce4ec-f48c-408f-8d8f-46414bde01df-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:50:26 user nova-compute[71428]: DEBUG nova.compute.manager [req-029bd62a-173a-4f64-ad59-339e3e934bc1 req-e2a78a0b-7a45-47e9-b877-0bea9be271d2 service nova] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] No waiting events found dispatching network-vif-plugged-c9e332a6-c2c7-41d3-9646-e5f672d8cb95 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:50:26 user nova-compute[71428]: WARNING nova.compute.manager [req-029bd62a-173a-4f64-ad59-339e3e934bc1 req-e2a78a0b-7a45-47e9-b877-0bea9be271d2 service nova] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Received unexpected event network-vif-plugged-c9e332a6-c2c7-41d3-9646-e5f672d8cb95 for instance with vm_state building and task_state spawning. Apr 23 03:50:26 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:26 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:26 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:28 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Resumed> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:50:28 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] VM Resumed (Lifecycle Event) Apr 23 03:50:28 user nova-compute[71428]: DEBUG nova.compute.manager [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Instance event wait completed in 0 seconds for {{(pid=71428) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 23 03:50:28 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Guest created on hypervisor {{(pid=71428) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 23 03:50:28 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Instance spawned successfully. 
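The AddBridgeCommand / AddPortCommand / DbSetCommand transactions logged during the VIF plug above are what os-vif's OVS plugin issues through ovsdbapp. A minimal sketch of the same wiring, assuming a local OVSDB socket at unix:/var/run/openvswitch/db.sock (the socket path is an assumption, not taken from this log); the port name, MAC and UUIDs are the ones shown above:

    from ovsdbapp.backend.ovs_idl import connection
    from ovsdbapp.schema.open_vswitch import impl_idl

    # Connect to the local Open vSwitch database (socket path is an assumption).
    idl = connection.OvsdbIdl.from_server(
        'unix:/var/run/openvswitch/db.sock', 'Open_vSwitch')
    api = impl_idl.OvsdbIdl(connection.Connection(idl, timeout=10))

    # One transaction: ensure br-int exists, add the tap port, then tag the
    # interface with the Neutron port id, MAC and instance UUID (this mirrors
    # the DbSetCommand external_ids seen in the log above).
    with api.transaction(check_error=True) as txn:
        txn.add(api.add_br('br-int', may_exist=True, datapath_type='system'))
        txn.add(api.add_port('br-int', 'tapc9e332a6-c2', may_exist=True))
        txn.add(api.db_set(
            'Interface', 'tapc9e332a6-c2',
            ('external_ids', {'iface-id': 'c9e332a6-c2c7-41d3-9646-e5f672d8cb95',
                              'iface-status': 'active',
                              'attached-mac': 'fa:16:3e:58:83:cd',
                              'vm-uuid': '90cce4ec-f48c-408f-8d8f-46414bde01df'})))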
Apr 23 03:50:28 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 23 03:50:28 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:50:28 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 03:50:28 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Found default for hw_cdrom_bus of ide {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:50:28 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Found default for hw_disk_bus of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:50:28 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Found default for hw_input_bus of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:50:28 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Found default for hw_pointer_model of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:50:28 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Found default for hw_video_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:50:28 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 
90cce4ec-f48c-408f-8d8f-46414bde01df] Found default for hw_vif_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:50:28 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] During sync_power_state the instance has a pending task (spawning). Skip. Apr 23 03:50:28 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Started> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:50:28 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] VM Started (Lifecycle Event) Apr 23 03:50:28 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:50:28 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 03:50:28 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] During sync_power_state the instance has a pending task (spawning). Skip. Apr 23 03:50:28 user nova-compute[71428]: INFO nova.compute.manager [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Took 5.82 seconds to spawn the instance on the hypervisor. 
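The "Synchronizing instance power state ... / During sync_power_state the instance has a pending task (spawning). Skip." pairs above boil down to one decision: the hypervisor reports a different power state than the database, but while the instance still has a pending task the sync is deferred rather than acted on. A simplified, illustrative sketch of that check (not nova's actual _sync_instance_power_state, which also handles stop and reboot races):

    # nova.compute.power_state values seen in the log: 0 = NOSTATE, 1 = RUNNING
    NOSTATE, RUNNING = 0, 1

    def should_sync_power_state(db_power_state, vm_power_state, task_state):
        """Return True when the DB record should be updated to match the hypervisor."""
        if task_state is not None:
            # "During sync_power_state the instance has a pending task (spawning). Skip."
            return False
        return db_power_state != vm_power_state

    # The case logged above: DB power_state 0, VM power_state 1, task_state 'spawning'
    assert should_sync_power_state(NOSTATE, RUNNING, 'spawning') is False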
Apr 23 03:50:28 user nova-compute[71428]: DEBUG nova.compute.manager [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:50:28 user nova-compute[71428]: DEBUG nova.compute.manager [req-02947cbb-5d2c-487b-a46e-a5b8d24ee39f req-5737e729-5163-4ba2-a4a0-2b71bd0111e3 service nova] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Received event network-vif-plugged-c9e332a6-c2c7-41d3-9646-e5f672d8cb95 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:50:28 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-02947cbb-5d2c-487b-a46e-a5b8d24ee39f req-5737e729-5163-4ba2-a4a0-2b71bd0111e3 service nova] Acquiring lock "90cce4ec-f48c-408f-8d8f-46414bde01df-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:50:28 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-02947cbb-5d2c-487b-a46e-a5b8d24ee39f req-5737e729-5163-4ba2-a4a0-2b71bd0111e3 service nova] Lock "90cce4ec-f48c-408f-8d8f-46414bde01df-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:50:28 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-02947cbb-5d2c-487b-a46e-a5b8d24ee39f req-5737e729-5163-4ba2-a4a0-2b71bd0111e3 service nova] Lock "90cce4ec-f48c-408f-8d8f-46414bde01df-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:50:28 user nova-compute[71428]: DEBUG nova.compute.manager [req-02947cbb-5d2c-487b-a46e-a5b8d24ee39f req-5737e729-5163-4ba2-a4a0-2b71bd0111e3 service nova] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] No waiting events found dispatching network-vif-plugged-c9e332a6-c2c7-41d3-9646-e5f672d8cb95 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:50:28 user nova-compute[71428]: WARNING nova.compute.manager [req-02947cbb-5d2c-487b-a46e-a5b8d24ee39f req-5737e729-5163-4ba2-a4a0-2b71bd0111e3 service nova] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Received unexpected event network-vif-plugged-c9e332a6-c2c7-41d3-9646-e5f672d8cb95 for instance with vm_state building and task_state spawning. Apr 23 03:50:28 user nova-compute[71428]: INFO nova.compute.manager [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Took 6.66 seconds to build instance. 
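Each Acquiring / acquired / released triple around the "90cce4ec-...-events" lock above is emitted automatically by oslo.concurrency when a function guarded by its lock decorator runs; the waited/held durations come from the same wrapper (lockutils.py inner). A minimal sketch of the pattern, assuming an in-process lock named after the instance UUID; the handler body is hypothetical:

    from oslo_concurrency import lockutils

    INSTANCE_UUID = '90cce4ec-f48c-408f-8d8f-46414bde01df'

    # lockutils logs "Acquiring lock ...", "Lock ... acquired by ... :: waited Ns"
    # and "Lock ... released by ... :: held Ns" around every call, as seen above.
    @lockutils.synchronized(INSTANCE_UUID + '-events')
    def _pop_event():
        # Hypothetical body: look up and remove the waiter registered for
        # network-vif-plugged-c9e332a6-c2c7-41d3-9646-e5f672d8cb95.
        return None

    _pop_event()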
Apr 23 03:50:28 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:28 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-04800526-02b9-47fc-9ba4-542ab18461b2 tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lock "90cce4ec-f48c-408f-8d8f-46414bde01df" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.760s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:50:30 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:33 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:35 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:38 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:40 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:43 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:45 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:48 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:50 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:53 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:54 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:55 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:50:58 user nova-compute[71428]: DEBUG nova.compute.manager [None req-a247a50c-ed92-4a55-b891-807e4a0481bb tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:50:58 user nova-compute[71428]: INFO 
nova.compute.manager [None req-a247a50c-ed92-4a55-b891-807e4a0481bb tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] instance snapshotting Apr 23 03:50:59 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-a247a50c-ed92-4a55-b891-807e4a0481bb tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Beginning live snapshot process Apr 23 03:50:59 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-a247a50c-ed92-4a55-b891-807e4a0481bb tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk --force-share --output=json -f qcow2 {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:50:59 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-a247a50c-ed92-4a55-b891-807e4a0481bb tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk --force-share --output=json -f qcow2" returned: 0 in 0.138s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:50:59 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-a247a50c-ed92-4a55-b891-807e4a0481bb tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk --force-share --output=json -f qcow2 {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:50:59 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-a247a50c-ed92-4a55-b891-807e4a0481bb tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk --force-share --output=json -f qcow2" returned: 0 in 0.133s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:50:59 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-a247a50c-ed92-4a55-b891-807e4a0481bb tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:50:59 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-a247a50c-ed92-4a55-b891-807e4a0481bb tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.133s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:50:59 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-a247a50c-ed92-4a55-b891-807e4a0481bb tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmp4e2pm0qx/59682707b91b4ab28344b71b97b0bc44.delta 1073741824 {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:50:59 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-a247a50c-ed92-4a55-b891-807e4a0481bb tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmp4e2pm0qx/59682707b91b4ab28344b71b97b0bc44.delta 1073741824" returned: 0 in 0.047s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:50:59 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-a247a50c-ed92-4a55-b891-807e4a0481bb tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Quiescing instance not available: QEMU guest agent is not enabled. 
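The live snapshot above is built around two qemu-img invocations that appear verbatim in the log: create a qcow2 delta backed by the cached _base image, then, after the libvirt COPY block job in the lines that follow has filled it, convert the delta into a standalone qcow2 for upload. A sketch of those two calls through oslo.concurrency, using the exact paths logged here:

    from oslo_concurrency import processutils

    base = '/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83'
    delta = ('/opt/stack/data/nova/instances/snapshots/tmp4e2pm0qx/'
             '59682707b91b4ab28344b71b97b0bc44.delta')
    out = delta[:-len('.delta')]  # upload file: same name without the .delta suffix

    # 1 GiB qcow2 overlay whose backing file is the raw _base image.
    processutils.execute(
        'qemu-img', 'create', '-f', 'qcow2',
        '-o', 'backing_file=%s,backing_fmt=raw' % base, delta, '1073741824')

    # ... the libvirt block copy mirrors the running disk into the delta here ...

    # Flatten overlay + backing file into one qcow2 for the image upload.
    processutils.execute(
        'qemu-img', 'convert', '-t', 'none', '-O', 'qcow2', '-f', 'qcow2', delta, out)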
Apr 23 03:51:00 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:00 user nova-compute[71428]: DEBUG nova.virt.libvirt.guest [None req-a247a50c-ed92-4a55-b891-807e4a0481bb tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] COPY block job progress, current cursor: 0 final cursor: 43778048 {{(pid=71428) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 23 03:51:00 user nova-compute[71428]: DEBUG nova.virt.libvirt.guest [None req-a247a50c-ed92-4a55-b891-807e4a0481bb tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] COPY block job progress, current cursor: 43778048 final cursor: 43778048 {{(pid=71428) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 23 03:51:00 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-a247a50c-ed92-4a55-b891-807e4a0481bb tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Skipping quiescing instance: QEMU guest agent is not enabled. Apr 23 03:51:01 user nova-compute[71428]: DEBUG nova.privsep.utils [None req-a247a50c-ed92-4a55-b891-807e4a0481bb tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71428) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 23 03:51:01 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-a247a50c-ed92-4a55-b891-807e4a0481bb tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmp4e2pm0qx/59682707b91b4ab28344b71b97b0bc44.delta /opt/stack/data/nova/instances/snapshots/tmp4e2pm0qx/59682707b91b4ab28344b71b97b0bc44 {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:51:01 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-a247a50c-ed92-4a55-b891-807e4a0481bb tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmp4e2pm0qx/59682707b91b4ab28344b71b97b0bc44.delta /opt/stack/data/nova/instances/snapshots/tmp4e2pm0qx/59682707b91b4ab28344b71b97b0bc44" returned: 0 in 0.603s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:51:01 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-a247a50c-ed92-4a55-b891-807e4a0481bb tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Snapshot extracted, beginning image upload Apr 23 03:51:01 user nova-compute[71428]: DEBUG nova.compute.manager [req-bcef500c-c183-48f1-ac94-c14cb0e1bbf9 req-823e9526-3837-4a65-b1d6-9b1b45b4271a service nova] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Received event network-changed-aaf8a702-e47a-4fc7-a9a8-3a7e63525294 {{(pid=71428) 
external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:51:01 user nova-compute[71428]: DEBUG nova.compute.manager [req-bcef500c-c183-48f1-ac94-c14cb0e1bbf9 req-823e9526-3837-4a65-b1d6-9b1b45b4271a service nova] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Refreshing instance network info cache due to event network-changed-aaf8a702-e47a-4fc7-a9a8-3a7e63525294. {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 23 03:51:01 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-bcef500c-c183-48f1-ac94-c14cb0e1bbf9 req-823e9526-3837-4a65-b1d6-9b1b45b4271a service nova] Acquiring lock "refresh_cache-6ee9a44e-e6af-4eec-975c-3991146ce71b" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:51:01 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-bcef500c-c183-48f1-ac94-c14cb0e1bbf9 req-823e9526-3837-4a65-b1d6-9b1b45b4271a service nova] Acquired lock "refresh_cache-6ee9a44e-e6af-4eec-975c-3991146ce71b" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:51:01 user nova-compute[71428]: DEBUG nova.network.neutron [req-bcef500c-c183-48f1-ac94-c14cb0e1bbf9 req-823e9526-3837-4a65-b1d6-9b1b45b4271a service nova] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Refreshing network info cache for port aaf8a702-e47a-4fc7-a9a8-3a7e63525294 {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 23 03:51:02 user nova-compute[71428]: DEBUG nova.network.neutron [req-bcef500c-c183-48f1-ac94-c14cb0e1bbf9 req-823e9526-3837-4a65-b1d6-9b1b45b4271a service nova] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Updated VIF entry in instance network info cache for port aaf8a702-e47a-4fc7-a9a8-3a7e63525294. 
{{(pid=71428) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 23 03:51:02 user nova-compute[71428]: DEBUG nova.network.neutron [req-bcef500c-c183-48f1-ac94-c14cb0e1bbf9 req-823e9526-3837-4a65-b1d6-9b1b45b4271a service nova] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Updating instance_info_cache with network_info: [{"id": "aaf8a702-e47a-4fc7-a9a8-3a7e63525294", "address": "fa:16:3e:a3:c1:eb", "network": {"id": "5604c946-dd68-4517-9c39-b4d33d86a345", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1424127131-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.67", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70b031ddc5c94ca98e7161de03bda4b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapaaf8a702-e4", "ovs_interfaceid": "aaf8a702-e47a-4fc7-a9a8-3a7e63525294", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:51:02 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-bcef500c-c183-48f1-ac94-c14cb0e1bbf9 req-823e9526-3837-4a65-b1d6-9b1b45b4271a service nova] Releasing lock "refresh_cache-6ee9a44e-e6af-4eec-975c-3991146ce71b" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:51:02 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-795e6b08-8d8c-4675-95f6-1196ad88bdac tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Acquiring lock "6ee9a44e-e6af-4eec-975c-3991146ce71b" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:51:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-795e6b08-8d8c-4675-95f6-1196ad88bdac tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Lock "6ee9a44e-e6af-4eec-975c-3991146ce71b" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:51:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-795e6b08-8d8c-4675-95f6-1196ad88bdac tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Acquiring lock "6ee9a44e-e6af-4eec-975c-3991146ce71b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:51:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-795e6b08-8d8c-4675-95f6-1196ad88bdac 
tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Lock "6ee9a44e-e6af-4eec-975c-3991146ce71b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:51:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-795e6b08-8d8c-4675-95f6-1196ad88bdac tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Lock "6ee9a44e-e6af-4eec-975c-3991146ce71b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:51:03 user nova-compute[71428]: INFO nova.compute.manager [None req-795e6b08-8d8c-4675-95f6-1196ad88bdac tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Terminating instance Apr 23 03:51:03 user nova-compute[71428]: DEBUG nova.compute.manager [None req-795e6b08-8d8c-4675-95f6-1196ad88bdac tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Start destroying the instance on the hypervisor. {{(pid=71428) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 23 03:51:03 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:03 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:03 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:03 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:03 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:03 user nova-compute[71428]: DEBUG nova.compute.manager [req-b3c12500-5d16-4694-8521-4466545cc460 req-2ffd29c7-471e-4063-8616-c9ea6fca7a8d service nova] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Received event network-vif-unplugged-aaf8a702-e47a-4fc7-a9a8-3a7e63525294 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:51:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-b3c12500-5d16-4694-8521-4466545cc460 req-2ffd29c7-471e-4063-8616-c9ea6fca7a8d service nova] Acquiring lock "6ee9a44e-e6af-4eec-975c-3991146ce71b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:51:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-b3c12500-5d16-4694-8521-4466545cc460 req-2ffd29c7-471e-4063-8616-c9ea6fca7a8d service nova] Lock "6ee9a44e-e6af-4eec-975c-3991146ce71b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:51:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-b3c12500-5d16-4694-8521-4466545cc460 req-2ffd29c7-471e-4063-8616-c9ea6fca7a8d service nova] Lock "6ee9a44e-e6af-4eec-975c-3991146ce71b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:51:03 user nova-compute[71428]: DEBUG nova.compute.manager [req-b3c12500-5d16-4694-8521-4466545cc460 req-2ffd29c7-471e-4063-8616-c9ea6fca7a8d service nova] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] No waiting events found dispatching network-vif-unplugged-aaf8a702-e47a-4fc7-a9a8-3a7e63525294 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:51:03 user nova-compute[71428]: DEBUG nova.compute.manager [req-b3c12500-5d16-4694-8521-4466545cc460 req-2ffd29c7-471e-4063-8616-c9ea6fca7a8d service nova] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Received event network-vif-unplugged-aaf8a702-e47a-4fc7-a9a8-3a7e63525294 for instance with task_state deleting. {{(pid=71428) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 23 03:51:04 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:04 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:04 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Instance destroyed successfully. 
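The teardown continues below with nova converting the port back into an os-vif object and calling the library's unplug entry point ("Unplugging vif VIFOpenVSwitch(...)", os_vif/__init__.py:109). A minimal sketch of that call, assuming os-vif and its OVS plugin are installed and configured locally; the field values are the ones from the VIF shown in the lines below:

    import os_vif
    from os_vif.objects import instance_info as osv_instance
    from os_vif.objects import network as osv_network
    from os_vif.objects import vif as osv_vif

    os_vif.initialize()  # load the linux_bridge/noop/ovs plugins

    vif = osv_vif.VIFOpenVSwitch(
        id='aaf8a702-e47a-4fc7-a9a8-3a7e63525294',
        address='fa:16:3e:a3:c1:eb',
        vif_name='tapaaf8a702-e4',
        bridge_name='br-int',
        plugin='ovs',
        port_profile=osv_vif.VIFPortProfileOpenVSwitch(
            interface_id='aaf8a702-e47a-4fc7-a9a8-3a7e63525294'),
        network=osv_network.Network(
            id='5604c946-dd68-4517-9c39-b4d33d86a345', bridge='br-int'))
    instance = osv_instance.InstanceInfo(
        uuid='6ee9a44e-e6af-4eec-975c-3991146ce71b',
        name='tempest-AttachVolumeTestJSON-server-95239721')

    os_vif.unplug(vif, instance)  # removes the tap port wiring created at plug time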
Apr 23 03:51:04 user nova-compute[71428]: DEBUG nova.objects.instance [None req-795e6b08-8d8c-4675-95f6-1196ad88bdac tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Lazy-loading 'resources' on Instance uuid 6ee9a44e-e6af-4eec-975c-3991146ce71b {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:51:04 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-795e6b08-8d8c-4675-95f6-1196ad88bdac tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:49:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-95239721',display_name='tempest-AttachVolumeTestJSON-server-95239721',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-95239721',id=10,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKEbelubDhWYxOcMjX0yw/rKDeobxDpM94ii4rSjeoKkqXPZBE/gk0LSlgoNggx8xKW/qBeJcmHECpzHOVf8zkPxD/oth6aCuNZeiupH6KLcLo/umHtjWJDWCvB+mcrBEQ==',key_name='tempest-keypair-1779422949',keypairs=,launch_index=0,launched_at=2023-04-23T03:49:19Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='70b031ddc5c94ca98e7161de03bda4b7',ramdisk_id='',reservation_id='r-sdoft0x0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeTestJSON-276721084',owner_user_name='tempest-AttachVolumeTestJSON-276721084-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-23T03:49:20Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e3d689f1c160478ca83bbff3104d8ec3',uuid=6ee9a44e-e6af-4eec-975c-3991146ce71b,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "aaf8a702-e47a-4fc7-a9a8-3a7e63525294", "address": "fa:16:3e:a3:c1:eb", "network": {"id": "5604c946-dd68-4517-9c39-b4d33d86a345", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1424127131-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": [{"address": "172.24.4.67", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70b031ddc5c94ca98e7161de03bda4b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapaaf8a702-e4", "ovs_interfaceid": "aaf8a702-e47a-4fc7-a9a8-3a7e63525294", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 23 03:51:04 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-795e6b08-8d8c-4675-95f6-1196ad88bdac tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Converting VIF {"id": "aaf8a702-e47a-4fc7-a9a8-3a7e63525294", "address": "fa:16:3e:a3:c1:eb", "network": {"id": "5604c946-dd68-4517-9c39-b4d33d86a345", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1424127131-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.67", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "70b031ddc5c94ca98e7161de03bda4b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapaaf8a702-e4", "ovs_interfaceid": "aaf8a702-e47a-4fc7-a9a8-3a7e63525294", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:51:04 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-795e6b08-8d8c-4675-95f6-1196ad88bdac tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a3:c1:eb,bridge_name='br-int',has_traffic_filtering=True,id=aaf8a702-e47a-4fc7-a9a8-3a7e63525294,network=Network(5604c946-dd68-4517-9c39-b4d33d86a345),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaaf8a702-e4') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:51:04 user nova-compute[71428]: DEBUG os_vif [None req-795e6b08-8d8c-4675-95f6-1196ad88bdac tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a3:c1:eb,bridge_name='br-int',has_traffic_filtering=True,id=aaf8a702-e47a-4fc7-a9a8-3a7e63525294,network=Network(5604c946-dd68-4517-9c39-b4d33d86a345),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaaf8a702-e4') {{(pid=71428) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 23 03:51:04 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-a247a50c-ed92-4a55-b891-807e4a0481bb tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] 
[instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Snapshot image upload complete Apr 23 03:51:04 user nova-compute[71428]: INFO nova.compute.manager [None req-a247a50c-ed92-4a55-b891-807e4a0481bb tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Took 5.29 seconds to snapshot the instance on the hypervisor. Apr 23 03:51:04 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:04 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaaf8a702-e4, bridge=br-int, if_exists=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:51:04 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:04 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:51:04 user nova-compute[71428]: INFO os_vif [None req-795e6b08-8d8c-4675-95f6-1196ad88bdac tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a3:c1:eb,bridge_name='br-int',has_traffic_filtering=True,id=aaf8a702-e47a-4fc7-a9a8-3a7e63525294,network=Network(5604c946-dd68-4517-9c39-b4d33d86a345),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaaf8a702-e4') Apr 23 03:51:04 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-795e6b08-8d8c-4675-95f6-1196ad88bdac tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Deleting instance files /opt/stack/data/nova/instances/6ee9a44e-e6af-4eec-975c-3991146ce71b_del Apr 23 03:51:04 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-795e6b08-8d8c-4675-95f6-1196ad88bdac tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Deletion of /opt/stack/data/nova/instances/6ee9a44e-e6af-4eec-975c-3991146ce71b_del complete Apr 23 03:51:04 user nova-compute[71428]: INFO nova.compute.manager [None req-795e6b08-8d8c-4675-95f6-1196ad88bdac tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Took 0.83 seconds to destroy the instance on the hypervisor. Apr 23 03:51:04 user nova-compute[71428]: DEBUG oslo.service.loopingcall [None req-795e6b08-8d8c-4675-95f6-1196ad88bdac tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71428) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 23 03:51:04 user nova-compute[71428]: DEBUG nova.compute.manager [-] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Deallocating network for instance {{(pid=71428) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 23 03:51:04 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] deallocate_for_instance() {{(pid=71428) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 23 03:51:05 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Updating instance_info_cache with network_info: [] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:51:05 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Took 0.81 seconds to deallocate network for instance. Apr 23 03:51:05 user nova-compute[71428]: DEBUG nova.compute.manager [req-063b1c65-d56b-438d-b8d3-0476782664f6 req-a99ccd16-7bf0-4d5e-bfa9-04dfd90616e5 service nova] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Received event network-vif-deleted-aaf8a702-e47a-4fc7-a9a8-3a7e63525294 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:51:05 user nova-compute[71428]: INFO nova.compute.manager [req-063b1c65-d56b-438d-b8d3-0476782664f6 req-a99ccd16-7bf0-4d5e-bfa9-04dfd90616e5 service nova] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Neutron deleted interface aaf8a702-e47a-4fc7-a9a8-3a7e63525294; detaching it from the instance and deleting it from the info cache Apr 23 03:51:05 user nova-compute[71428]: DEBUG nova.network.neutron [req-063b1c65-d56b-438d-b8d3-0476782664f6 req-a99ccd16-7bf0-4d5e-bfa9-04dfd90616e5 service nova] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Updating instance_info_cache with network_info: [] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:51:05 user nova-compute[71428]: DEBUG nova.compute.manager [req-063b1c65-d56b-438d-b8d3-0476782664f6 req-a99ccd16-7bf0-4d5e-bfa9-04dfd90616e5 service nova] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Detach interface failed, port_id=aaf8a702-e47a-4fc7-a9a8-3a7e63525294, reason: Instance 6ee9a44e-e6af-4eec-975c-3991146ce71b could not be found. 
{{(pid=71428) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 23 03:51:05 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-795e6b08-8d8c-4675-95f6-1196ad88bdac tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:51:05 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-795e6b08-8d8c-4675-95f6-1196ad88bdac tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:51:05 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-795e6b08-8d8c-4675-95f6-1196ad88bdac tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:51:05 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-795e6b08-8d8c-4675-95f6-1196ad88bdac tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:51:05 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-795e6b08-8d8c-4675-95f6-1196ad88bdac tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.282s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:51:05 user nova-compute[71428]: INFO nova.scheduler.client.report [None req-795e6b08-8d8c-4675-95f6-1196ad88bdac tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Deleted allocations for instance 6ee9a44e-e6af-4eec-975c-3991146ce71b Apr 23 03:51:05 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-795e6b08-8d8c-4675-95f6-1196ad88bdac tempest-AttachVolumeTestJSON-276721084 tempest-AttachVolumeTestJSON-276721084-project-member] Lock "6ee9a44e-e6af-4eec-975c-3991146ce71b" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.136s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:51:05 user nova-compute[71428]: DEBUG nova.compute.manager [req-3c11e641-063c-47e0-8477-e1daaf312f02 req-56fb81c9-0122-4d35-abdb-484c3413d397 service nova] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Received event network-vif-plugged-aaf8a702-e47a-4fc7-a9a8-3a7e63525294 
{{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:51:05 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-3c11e641-063c-47e0-8477-e1daaf312f02 req-56fb81c9-0122-4d35-abdb-484c3413d397 service nova] Acquiring lock "6ee9a44e-e6af-4eec-975c-3991146ce71b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:51:05 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-3c11e641-063c-47e0-8477-e1daaf312f02 req-56fb81c9-0122-4d35-abdb-484c3413d397 service nova] Lock "6ee9a44e-e6af-4eec-975c-3991146ce71b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:51:05 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-3c11e641-063c-47e0-8477-e1daaf312f02 req-56fb81c9-0122-4d35-abdb-484c3413d397 service nova] Lock "6ee9a44e-e6af-4eec-975c-3991146ce71b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:51:05 user nova-compute[71428]: DEBUG nova.compute.manager [req-3c11e641-063c-47e0-8477-e1daaf312f02 req-56fb81c9-0122-4d35-abdb-484c3413d397 service nova] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] No waiting events found dispatching network-vif-plugged-aaf8a702-e47a-4fc7-a9a8-3a7e63525294 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:51:05 user nova-compute[71428]: WARNING nova.compute.manager [req-3c11e641-063c-47e0-8477-e1daaf312f02 req-56fb81c9-0122-4d35-abdb-484c3413d397 service nova] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Received unexpected event network-vif-plugged-aaf8a702-e47a-4fc7-a9a8-3a7e63525294 for instance with vm_state deleted and task_state None. 
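The sequence above -- "Received event network-vif-plugged-...", "No waiting events found dispatching ...", then the WARNING about an unexpected event for a deleted instance -- is the external-event handshake between Neutron and nova-compute: an event only has a consumer if some code path registered a waiter for it beforehand. An illustrative sketch of that prepare/pop pattern (not Nova's implementation; the class name and in-memory store are invented for the example):

    import threading

    class InstanceEvents:
        def __init__(self):
            self._waiters = {}     # {instance_uuid: {event_name: threading.Event}}
            self._lock = threading.Lock()

        def prepare(self, instance_uuid, event_name):
            with self._lock:
                waiter = threading.Event()
                self._waiters.setdefault(instance_uuid, {})[event_name] = waiter
                return waiter

        def pop(self, instance_uuid, event_name):
            with self._lock:
                return self._waiters.get(instance_uuid, {}).pop(event_name, None)

    events = InstanceEvents()
    waiter = events.pop("6ee9a44e-e6af-4eec-975c-3991146ce71b",
                        "network-vif-plugged-aaf8a702-e47a-4fc7-a9a8-3a7e63525294")
    if waiter is None:
        # Matches the "No waiting events found" / "Received unexpected event" lines above.
        print("no waiter registered; event is unexpected")
    else:
        waiter.set()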
Apr 23 03:51:06 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:51:08 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:51:08 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:09 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:10 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:51:10 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-6ba214cf-e85a-4b47-a6fd-39f3627fc565 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Acquiring lock "1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:51:10 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-6ba214cf-e85a-4b47-a6fd-39f3627fc565 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Lock "1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:51:10 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-6ba214cf-e85a-4b47-a6fd-39f3627fc565 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Acquiring lock "1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:51:10 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-6ba214cf-e85a-4b47-a6fd-39f3627fc565 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Lock "1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:51:10 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-6ba214cf-e85a-4b47-a6fd-39f3627fc565 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Lock "1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71428) 
inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:51:10 user nova-compute[71428]: INFO nova.compute.manager [None req-6ba214cf-e85a-4b47-a6fd-39f3627fc565 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Terminating instance Apr 23 03:51:10 user nova-compute[71428]: DEBUG nova.compute.manager [None req-6ba214cf-e85a-4b47-a6fd-39f3627fc565 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Start destroying the instance on the hypervisor. {{(pid=71428) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 23 03:51:11 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:11 user nova-compute[71428]: DEBUG nova.compute.manager [req-c0a88ff6-3bf9-4c54-9732-2cf09374de31 req-e1c9ad1d-1f94-46ae-b73c-7a95f9767245 service nova] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Received event network-vif-unplugged-dbcb6234-6b80-49f9-b7a1-0f339da7084a {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:51:11 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-c0a88ff6-3bf9-4c54-9732-2cf09374de31 req-e1c9ad1d-1f94-46ae-b73c-7a95f9767245 service nova] Acquiring lock "1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:51:11 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-c0a88ff6-3bf9-4c54-9732-2cf09374de31 req-e1c9ad1d-1f94-46ae-b73c-7a95f9767245 service nova] Lock "1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:51:11 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-c0a88ff6-3bf9-4c54-9732-2cf09374de31 req-e1c9ad1d-1f94-46ae-b73c-7a95f9767245 service nova] Lock "1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:51:11 user nova-compute[71428]: DEBUG nova.compute.manager [req-c0a88ff6-3bf9-4c54-9732-2cf09374de31 req-e1c9ad1d-1f94-46ae-b73c-7a95f9767245 service nova] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] No waiting events found dispatching network-vif-unplugged-dbcb6234-6b80-49f9-b7a1-0f339da7084a {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:51:11 user nova-compute[71428]: DEBUG nova.compute.manager [req-c0a88ff6-3bf9-4c54-9732-2cf09374de31 req-e1c9ad1d-1f94-46ae-b73c-7a95f9767245 service nova] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Received event network-vif-unplugged-dbcb6234-6b80-49f9-b7a1-0f339da7084a for instance with task_state deleting. 
{{(pid=71428) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 23 03:51:11 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:11 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:11 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:51:11 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:51:11 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:51:11 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:51:11 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Auditing locally available compute resources for user (node: user) {{(pid=71428) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 23 03:51:11 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Instance destroyed successfully. 
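The "Running periodic task ComputeManager...." lines interleaved with the teardown are oslo.service periodic tasks firing on their own cadence inside the same process. A rough sketch of how such tasks are declared and driven, assuming oslo.service's periodic_task decorator; the class, task bodies and spacing values are illustrative only:

    from oslo_config import cfg
    from oslo_service import periodic_task

    CONF = cfg.CONF

    class DemoManager(periodic_task.PeriodicTasks):
        def __init__(self):
            super().__init__(CONF)

        @periodic_task.periodic_task(spacing=10)
        def _check_instance_build_time(self, context):
            # In nova-compute this looks for instances stuck in BUILD for too long.
            print("running _check_instance_build_time")

        @periodic_task.periodic_task(spacing=60)
        def update_available_resource(self, context):
            # In nova-compute this re-audits hypervisor resources, producing the
            # "Auditing locally available compute resources" line above.
            print("running update_available_resource")

    manager = DemoManager()
    manager.run_periodic_tasks(context=None)   # the service loop calls this repeatedly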
Apr 23 03:51:11 user nova-compute[71428]: DEBUG nova.objects.instance [None req-6ba214cf-e85a-4b47-a6fd-39f3627fc565 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Lazy-loading 'resources' on Instance uuid 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:51:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-6ba214cf-e85a-4b47-a6fd-39f3627fc565 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:49:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-1265143385',display_name='tempest-VolumesAdminNegativeTest-server-1265143385',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-1265143385',id=11,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-23T03:49:25Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='5eb0a03655cf4aa78e27c81ea4e1c424',ramdisk_id='',reservation_id='r-ttjiy9bj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-VolumesAdminNegativeTest-594073230',owner_user_name='tempest-VolumesAdminNegativeTest-594073230-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-23T03:49:26Z,user_data=None,user_id='3e80c354abd34bd3a28ddaeec9535af2',uuid=1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dbcb6234-6b80-49f9-b7a1-0f339da7084a", "address": "fa:16:3e:eb:b4:74", "network": {"id": "dca689d5-53e5-466b-bd5c-4c29b330d27e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2008984306-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5eb0a03655cf4aa78e27c81ea4e1c424", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", 
"bound_drivers": {"0": "ovn"}}, "devname": "tapdbcb6234-6b", "ovs_interfaceid": "dbcb6234-6b80-49f9-b7a1-0f339da7084a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 23 03:51:11 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-6ba214cf-e85a-4b47-a6fd-39f3627fc565 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Converting VIF {"id": "dbcb6234-6b80-49f9-b7a1-0f339da7084a", "address": "fa:16:3e:eb:b4:74", "network": {"id": "dca689d5-53e5-466b-bd5c-4c29b330d27e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2008984306-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5eb0a03655cf4aa78e27c81ea4e1c424", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapdbcb6234-6b", "ovs_interfaceid": "dbcb6234-6b80-49f9-b7a1-0f339da7084a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:51:11 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-6ba214cf-e85a-4b47-a6fd-39f3627fc565 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:b4:74,bridge_name='br-int',has_traffic_filtering=True,id=dbcb6234-6b80-49f9-b7a1-0f339da7084a,network=Network(dca689d5-53e5-466b-bd5c-4c29b330d27e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdbcb6234-6b') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:51:11 user nova-compute[71428]: DEBUG os_vif [None req-6ba214cf-e85a-4b47-a6fd-39f3627fc565 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:b4:74,bridge_name='br-int',has_traffic_filtering=True,id=dbcb6234-6b80-49f9-b7a1-0f339da7084a,network=Network(dca689d5-53e5-466b-bd5c-4c29b330d27e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdbcb6234-6b') {{(pid=71428) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 23 03:51:11 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:11 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdbcb6234-6b, bridge=br-int, if_exists=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:51:11 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:11 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:51:11 user nova-compute[71428]: INFO os_vif [None req-6ba214cf-e85a-4b47-a6fd-39f3627fc565 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:b4:74,bridge_name='br-int',has_traffic_filtering=True,id=dbcb6234-6b80-49f9-b7a1-0f339da7084a,network=Network(dca689d5-53e5-466b-bd5c-4c29b330d27e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdbcb6234-6b') Apr 23 03:51:11 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-6ba214cf-e85a-4b47-a6fd-39f3627fc565 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Deleting instance files /opt/stack/data/nova/instances/1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86_del Apr 23 03:51:11 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-6ba214cf-e85a-4b47-a6fd-39f3627fc565 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Deletion of /opt/stack/data/nova/instances/1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86_del complete Apr 23 03:51:11 user nova-compute[71428]: INFO nova.compute.manager [None req-6ba214cf-e85a-4b47-a6fd-39f3627fc565 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Took 0.84 seconds to destroy the instance on the hypervisor. Apr 23 03:51:11 user nova-compute[71428]: DEBUG oslo.service.loopingcall [None req-6ba214cf-e85a-4b47-a6fd-39f3627fc565 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71428) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 23 03:51:11 user nova-compute[71428]: DEBUG nova.compute.manager [-] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Deallocating network for instance {{(pid=71428) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 23 03:51:11 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] deallocate_for_instance() {{(pid=71428) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 23 03:51:11 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7c1f1f24-62f5-4e51-8c74-b5b89b6585a4/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:51:12 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7c1f1f24-62f5-4e51-8c74-b5b89b6585a4/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:51:12 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7c1f1f24-62f5-4e51-8c74-b5b89b6585a4/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:51:12 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7c1f1f24-62f5-4e51-8c74-b5b89b6585a4/disk --force-share --output=json" returned: 0 in 0.171s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:51:12 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/90cce4ec-f48c-408f-8d8f-46414bde01df/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:51:12 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/90cce4ec-f48c-408f-8d8f-46414bde01df/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:51:12 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] 
Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/90cce4ec-f48c-408f-8d8f-46414bde01df/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:51:12 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Updating instance_info_cache with network_info: [] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:51:12 user nova-compute[71428]: DEBUG nova.compute.manager [req-94e8d73a-7985-474c-b848-d1e250a21311 req-f3ce1ffa-c36c-4e43-b140-dc4f3d58ec32 service nova] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Received event network-vif-deleted-dbcb6234-6b80-49f9-b7a1-0f339da7084a {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:51:12 user nova-compute[71428]: INFO nova.compute.manager [req-94e8d73a-7985-474c-b848-d1e250a21311 req-f3ce1ffa-c36c-4e43-b140-dc4f3d58ec32 service nova] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Neutron deleted interface dbcb6234-6b80-49f9-b7a1-0f339da7084a; detaching it from the instance and deleting it from the info cache Apr 23 03:51:12 user nova-compute[71428]: DEBUG nova.network.neutron [req-94e8d73a-7985-474c-b848-d1e250a21311 req-f3ce1ffa-c36c-4e43-b140-dc4f3d58ec32 service nova] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Updating instance_info_cache with network_info: [] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:51:12 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Took 0.67 seconds to deallocate network for instance. Apr 23 03:51:12 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/90cce4ec-f48c-408f-8d8f-46414bde01df/disk --force-share --output=json" returned: 0 in 0.164s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:51:12 user nova-compute[71428]: DEBUG nova.compute.manager [req-94e8d73a-7985-474c-b848-d1e250a21311 req-f3ce1ffa-c36c-4e43-b140-dc4f3d58ec32 service nova] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Detach interface failed, port_id=dbcb6234-6b80-49f9-b7a1-0f339da7084a, reason: Instance 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86 could not be found. 
{{(pid=71428) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 23 03:51:12 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/954f41b2-5c67-41a4-b38b-ebf3ad60cac7/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:51:12 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-6ba214cf-e85a-4b47-a6fd-39f3627fc565 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:51:12 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-6ba214cf-e85a-4b47-a6fd-39f3627fc565 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.003s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:51:12 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/954f41b2-5c67-41a4-b38b-ebf3ad60cac7/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:51:12 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/954f41b2-5c67-41a4-b38b-ebf3ad60cac7/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:51:12 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-6ba214cf-e85a-4b47-a6fd-39f3627fc565 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:51:12 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-6ba214cf-e85a-4b47-a6fd-39f3627fc565 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 
03:51:12 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/954f41b2-5c67-41a4-b38b-ebf3ad60cac7/disk --force-share --output=json" returned: 0 in 0.149s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:51:12 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:51:12 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-6ba214cf-e85a-4b47-a6fd-39f3627fc565 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.301s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:51:12 user nova-compute[71428]: INFO nova.scheduler.client.report [None req-6ba214cf-e85a-4b47-a6fd-39f3627fc565 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Deleted allocations for instance 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86 Apr 23 03:51:12 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:51:12 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:51:13 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-6ba214cf-e85a-4b47-a6fd-39f3627fc565 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Lock "1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.050s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:51:13 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=71428) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:51:13 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Error from libvirt while getting description of instance-0000000b: [Error Code 42] Domain not found: no domain with matching uuid '1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86' (instance-0000000b): libvirt.libvirtError: Domain not found: no domain with matching uuid '1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86' (instance-0000000b) Apr 23 03:51:13 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91756f90-733e-4aa5-9108-d2d8b1d020fe/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:51:13 user nova-compute[71428]: DEBUG nova.compute.manager [req-ea622c0c-4729-4abd-89da-1e36ee90052a req-51d25521-78a6-4dda-acd5-56807db41757 service nova] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Received event network-vif-plugged-dbcb6234-6b80-49f9-b7a1-0f339da7084a {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:51:13 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-ea622c0c-4729-4abd-89da-1e36ee90052a req-51d25521-78a6-4dda-acd5-56807db41757 service nova] Acquiring lock "1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:51:13 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-ea622c0c-4729-4abd-89da-1e36ee90052a req-51d25521-78a6-4dda-acd5-56807db41757 service nova] Lock "1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:51:13 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-ea622c0c-4729-4abd-89da-1e36ee90052a req-51d25521-78a6-4dda-acd5-56807db41757 service nova] Lock "1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:51:13 user nova-compute[71428]: DEBUG nova.compute.manager [req-ea622c0c-4729-4abd-89da-1e36ee90052a req-51d25521-78a6-4dda-acd5-56807db41757 service nova] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] No waiting events found dispatching network-vif-plugged-dbcb6234-6b80-49f9-b7a1-0f339da7084a {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:51:13 user nova-compute[71428]: WARNING nova.compute.manager [req-ea622c0c-4729-4abd-89da-1e36ee90052a req-51d25521-78a6-4dda-acd5-56807db41757 service nova] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Received unexpected event network-vif-plugged-dbcb6234-6b80-49f9-b7a1-0f339da7084a for instance with vm_state deleted and task_state None. 
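The WARNING "[Error Code 42] Domain not found" above is the benign race where the periodic resource audit asks libvirt about a guest that terminate_instance has already undefined; error code 42 is libvirt's VIR_ERR_NO_DOMAIN. A sketch of how a caller distinguishes that case with the libvirt-python bindings -- the URI and domain name are taken from the log and will not exist on an arbitrary machine:

    import libvirt

    conn = libvirt.open("qemu:///system")
    try:
        dom = conn.lookupByName("instance-0000000b")
        print("domain state:", dom.state())
    except libvirt.libvirtError as exc:
        if exc.get_error_code() == libvirt.VIR_ERR_NO_DOMAIN:   # error code 42
            # Expected during teardown races: the guest was already destroyed/undefined.
            print("domain already gone, nothing to do")
        else:
            raise
    finally:
        conn.close()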
Apr 23 03:51:13 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91756f90-733e-4aa5-9108-d2d8b1d020fe/disk --force-share --output=json" returned: 0 in 0.144s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:51:13 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91756f90-733e-4aa5-9108-d2d8b1d020fe/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:51:13 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91756f90-733e-4aa5-9108-d2d8b1d020fe/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:51:13 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:51:13 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
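The repeated "qemu-img info ... --force-share --output=json" commands, wrapped in "python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30", are the resource audit probing each instance disk while capping the helper at 1 GiB of address space and 30 s of CPU. A sketch of issuing the same probe with oslo.concurrency; the prlimit keyword and ProcessLimits fields are assumed from the command line shown above, and the disk path is one of the instances in this log:

    import json
    from oslo_concurrency import processutils

    limits = processutils.ProcessLimits(address_space=1073741824, cpu_time=30)
    out, _err = processutils.execute(
        "env", "LC_ALL=C", "LANG=C",
        "qemu-img", "info",
        "/opt/stack/data/nova/instances/90cce4ec-f48c-408f-8d8f-46414bde01df/disk",
        "--force-share", "--output=json",
        prlimit=limits)

    info = json.loads(out)
    print(info["format"], info["virtual-size"], info.get("actual-size"))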
Apr 23 03:51:13 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Hypervisor/Node resource view: name=user free_ram=8413MB free_disk=26.18225860595703GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71428) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 23 03:51:13 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:51:13 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:51:13 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 954f41b2-5c67-41a4-b38b-ebf3ad60cac7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:51:13 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 91756f90-733e-4aa5-9108-d2d8b1d020fe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:51:13 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance c8480a7c-b5de-4f66-a4f0-08fc679b0dfd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:51:13 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:51:13 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 90cce4ec-f48c-408f-8d8f-46414bde01df actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:51:13 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Total usable vcpus: 12, total allocated vcpus: 5 {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 23 03:51:13 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Final resource view: name=user phys_ram=16023MB used_ram=1152MB phys_disk=40GB used_disk=5GB total_vcpus=12 used_vcpus=5 pci_stats=[] {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 23 03:51:14 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:51:14 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:51:14 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Compute_service record updated for user:user {{(pid=71428) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 23 03:51:14 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.297s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:51:15 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:51:15 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:51:15 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running 
periodic task ComputeManager._instance_usage_audit {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:51:15 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:51:15 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71428) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 23 03:51:16 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:51:16 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Starting heal instance info cache {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 23 03:51:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "refresh_cache-91756f90-733e-4aa5-9108-d2d8b1d020fe" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:51:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquired lock "refresh_cache-91756f90-733e-4aa5-9108-d2d8b1d020fe" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:51:16 user nova-compute[71428]: DEBUG nova.network.neutron [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Forcefully refreshing network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 23 03:51:17 user nova-compute[71428]: DEBUG nova.network.neutron [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Updating instance_info_cache with network_info: [{"id": "b7f45c1f-8b9b-4d06-a389-d7643565e934", "address": "fa:16:3e:29:a9:72", "network": {"id": "dca689d5-53e5-466b-bd5c-4c29b330d27e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2008984306-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.168", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5eb0a03655cf4aa78e27c81ea4e1c424", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f45c1f-8b", "ovs_interfaceid": "b7f45c1f-8b9b-4d06-a389-d7643565e934", "qbh_params": null, "qbg_params": null, 
"active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:51:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Releasing lock "refresh_cache-91756f90-733e-4aa5-9108-d2d8b1d020fe" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:51:17 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Updated the network info_cache for instance {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 23 03:51:19 user nova-compute[71428]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:51:19 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] VM Stopped (Lifecycle Event) Apr 23 03:51:19 user nova-compute[71428]: DEBUG nova.compute.manager [None req-1053662e-badf-4d67-af09-8486f7b542e1 None None] [instance: 6ee9a44e-e6af-4eec-975c-3991146ce71b] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:51:21 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:26 user nova-compute[71428]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:51:26 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] VM Stopped (Lifecycle Event) Apr 23 03:51:26 user nova-compute[71428]: DEBUG nova.compute.manager [None req-cd6ee15c-a277-43b5-bc70-815d0446b0b2 None None] [instance: 1d5ec75f-29f2-45c3-8d6e-74d9ec8b2f86] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:51:26 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:51:31 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:51:33 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:36 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:38 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:41 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:46 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:48 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
[POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:51 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:54 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:56 user nova-compute[71428]: DEBUG nova.compute.manager [req-a811da9d-24ad-457c-97ac-323b43183866 req-df9bf5e1-b4dc-4103-9425-ded6a644cc1f service nova] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Received event network-changed-2a195833-5430-4cc1-a938-5b4157c32300 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:51:56 user nova-compute[71428]: DEBUG nova.compute.manager [req-a811da9d-24ad-457c-97ac-323b43183866 req-df9bf5e1-b4dc-4103-9425-ded6a644cc1f service nova] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Refreshing instance network info cache due to event network-changed-2a195833-5430-4cc1-a938-5b4157c32300. {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 23 03:51:56 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-a811da9d-24ad-457c-97ac-323b43183866 req-df9bf5e1-b4dc-4103-9425-ded6a644cc1f service nova] Acquiring lock "refresh_cache-7c1f1f24-62f5-4e51-8c74-b5b89b6585a4" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:51:56 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-a811da9d-24ad-457c-97ac-323b43183866 req-df9bf5e1-b4dc-4103-9425-ded6a644cc1f service nova] Acquired lock "refresh_cache-7c1f1f24-62f5-4e51-8c74-b5b89b6585a4" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:51:56 user nova-compute[71428]: DEBUG nova.network.neutron [req-a811da9d-24ad-457c-97ac-323b43183866 req-df9bf5e1-b4dc-4103-9425-ded6a644cc1f service nova] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Refreshing network info cache for port 2a195833-5430-4cc1-a938-5b4157c32300 {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 23 03:51:56 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:57 user nova-compute[71428]: DEBUG nova.network.neutron [req-a811da9d-24ad-457c-97ac-323b43183866 req-df9bf5e1-b4dc-4103-9425-ded6a644cc1f service nova] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Updated VIF entry in instance network info cache for port 2a195833-5430-4cc1-a938-5b4157c32300. 
{{(pid=71428) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 23 03:51:57 user nova-compute[71428]: DEBUG nova.network.neutron [req-a811da9d-24ad-457c-97ac-323b43183866 req-df9bf5e1-b4dc-4103-9425-ded6a644cc1f service nova] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Updating instance_info_cache with network_info: [{"id": "2a195833-5430-4cc1-a938-5b4157c32300", "address": "fa:16:3e:c3:a4:eb", "network": {"id": "6adb189c-9b46-4da4-a34b-31edf8685498", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-759967402-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.134", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "24fff486a500421397ecb935828582cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a195833-54", "ovs_interfaceid": "2a195833-5430-4cc1-a938-5b4157c32300", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:51:57 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-a811da9d-24ad-457c-97ac-323b43183866 req-df9bf5e1-b4dc-4103-9425-ded6a644cc1f service nova] Releasing lock "refresh_cache-7c1f1f24-62f5-4e51-8c74-b5b89b6585a4" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:51:57 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:57 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f5a4f6f0-adfe-4bbf-89e4-440aa90c9188 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Acquiring lock "7c1f1f24-62f5-4e51-8c74-b5b89b6585a4" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:51:57 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f5a4f6f0-adfe-4bbf-89e4-440aa90c9188 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "7c1f1f24-62f5-4e51-8c74-b5b89b6585a4" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:51:57 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f5a4f6f0-adfe-4bbf-89e4-440aa90c9188 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Acquiring lock "7c1f1f24-62f5-4e51-8c74-b5b89b6585a4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:51:57 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None 
req-f5a4f6f0-adfe-4bbf-89e4-440aa90c9188 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "7c1f1f24-62f5-4e51-8c74-b5b89b6585a4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:51:57 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f5a4f6f0-adfe-4bbf-89e4-440aa90c9188 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "7c1f1f24-62f5-4e51-8c74-b5b89b6585a4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:51:57 user nova-compute[71428]: INFO nova.compute.manager [None req-f5a4f6f0-adfe-4bbf-89e4-440aa90c9188 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Terminating instance Apr 23 03:51:57 user nova-compute[71428]: DEBUG nova.compute.manager [None req-f5a4f6f0-adfe-4bbf-89e4-440aa90c9188 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Start destroying the instance on the hypervisor. {{(pid=71428) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 23 03:51:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:58 user nova-compute[71428]: DEBUG nova.compute.manager [req-a699f8e1-a04d-4eb9-9e0c-30535965a791 req-8e75d13b-c86e-4e6b-9d5c-969b84225609 service nova] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Received event network-vif-unplugged-2a195833-5430-4cc1-a938-5b4157c32300 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:51:58 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-a699f8e1-a04d-4eb9-9e0c-30535965a791 req-8e75d13b-c86e-4e6b-9d5c-969b84225609 service nova] Acquiring lock "7c1f1f24-62f5-4e51-8c74-b5b89b6585a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:51:58 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-a699f8e1-a04d-4eb9-9e0c-30535965a791 req-8e75d13b-c86e-4e6b-9d5c-969b84225609 service nova] Lock "7c1f1f24-62f5-4e51-8c74-b5b89b6585a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:51:58 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-a699f8e1-a04d-4eb9-9e0c-30535965a791 req-8e75d13b-c86e-4e6b-9d5c-969b84225609 service nova] Lock "7c1f1f24-62f5-4e51-8c74-b5b89b6585a4-events" "released" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:51:58 user nova-compute[71428]: DEBUG nova.compute.manager [req-a699f8e1-a04d-4eb9-9e0c-30535965a791 req-8e75d13b-c86e-4e6b-9d5c-969b84225609 service nova] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] No waiting events found dispatching network-vif-unplugged-2a195833-5430-4cc1-a938-5b4157c32300 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:51:58 user nova-compute[71428]: DEBUG nova.compute.manager [req-a699f8e1-a04d-4eb9-9e0c-30535965a791 req-8e75d13b-c86e-4e6b-9d5c-969b84225609 service nova] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Received event network-vif-unplugged-2a195833-5430-4cc1-a938-5b4157c32300 for instance with task_state deleting. {{(pid=71428) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 23 03:51:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:58 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Instance destroyed successfully. 
Apr 23 03:51:58 user nova-compute[71428]: DEBUG nova.objects.instance [None req-f5a4f6f0-adfe-4bbf-89e4-440aa90c9188 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lazy-loading 'resources' on Instance uuid 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:51:58 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-f5a4f6f0-adfe-4bbf-89e4-440aa90c9188 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:50:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-458473256',display_name='tempest-AttachVolumeNegativeTest-server-458473256',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-458473256',id=12,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPW+NA3vXdFxJXRfoNCWzZ6Pz/y62h/Owo9QJnwMZPgQAIglks3olXUz15miaWu+L14NHfoTREI5jCse5O5kM4puAOyJ3IeJt0XKaKpq44fQskU03DPweyp4+IKVldZSew==',key_name='tempest-keypair-152318519',keypairs=,launch_index=0,launched_at=2023-04-23T03:50:11Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='24fff486a500421397ecb935828582cd',ramdisk_id='',reservation_id='r-30oc0m0z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeNegativeTest-636753786',owner_user_name='tempest-AttachVolumeNegativeTest-636753786-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-23T03:50:11Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0a2c459cad014b07b2613e5e261d88aa',uuid=7c1f1f24-62f5-4e51-8c74-b5b89b6585a4,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2a195833-5430-4cc1-a938-5b4157c32300", "address": "fa:16:3e:c3:a4:eb", "network": {"id": "6adb189c-9b46-4da4-a34b-31edf8685498", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-759967402-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.134", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "24fff486a500421397ecb935828582cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a195833-54", "ovs_interfaceid": "2a195833-5430-4cc1-a938-5b4157c32300", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 23 03:51:58 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-f5a4f6f0-adfe-4bbf-89e4-440aa90c9188 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Converting VIF {"id": "2a195833-5430-4cc1-a938-5b4157c32300", "address": "fa:16:3e:c3:a4:eb", "network": {"id": "6adb189c-9b46-4da4-a34b-31edf8685498", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-759967402-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.134", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "24fff486a500421397ecb935828582cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a195833-54", "ovs_interfaceid": "2a195833-5430-4cc1-a938-5b4157c32300", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:51:58 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-f5a4f6f0-adfe-4bbf-89e4-440aa90c9188 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c3:a4:eb,bridge_name='br-int',has_traffic_filtering=True,id=2a195833-5430-4cc1-a938-5b4157c32300,network=Network(6adb189c-9b46-4da4-a34b-31edf8685498),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a195833-54') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:51:58 user nova-compute[71428]: DEBUG os_vif [None req-f5a4f6f0-adfe-4bbf-89e4-440aa90c9188 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c3:a4:eb,bridge_name='br-int',has_traffic_filtering=True,id=2a195833-5430-4cc1-a938-5b4157c32300,network=Network(6adb189c-9b46-4da4-a34b-31edf8685498),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a195833-54') {{(pid=71428) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 23 03:51:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2a195833-54, bridge=br-int, if_exists=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:51:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:51:58 user nova-compute[71428]: INFO os_vif [None req-f5a4f6f0-adfe-4bbf-89e4-440aa90c9188 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c3:a4:eb,bridge_name='br-int',has_traffic_filtering=True,id=2a195833-5430-4cc1-a938-5b4157c32300,network=Network(6adb189c-9b46-4da4-a34b-31edf8685498),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a195833-54') Apr 23 03:51:58 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-f5a4f6f0-adfe-4bbf-89e4-440aa90c9188 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Deleting instance files /opt/stack/data/nova/instances/7c1f1f24-62f5-4e51-8c74-b5b89b6585a4_del Apr 23 03:51:58 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-f5a4f6f0-adfe-4bbf-89e4-440aa90c9188 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Deletion of /opt/stack/data/nova/instances/7c1f1f24-62f5-4e51-8c74-b5b89b6585a4_del complete Apr 23 03:51:58 user nova-compute[71428]: INFO nova.compute.manager [None req-f5a4f6f0-adfe-4bbf-89e4-440aa90c9188 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Took 0.69 seconds to destroy the instance on the hypervisor. Apr 23 03:51:58 user nova-compute[71428]: DEBUG oslo.service.loopingcall [None req-f5a4f6f0-adfe-4bbf-89e4-440aa90c9188 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71428) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 23 03:51:58 user nova-compute[71428]: DEBUG nova.compute.manager [-] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Deallocating network for instance {{(pid=71428) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 23 03:51:58 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] deallocate_for_instance() {{(pid=71428) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 23 03:51:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:59 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:59 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:59 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:59 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:59 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:51:59 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:00 user nova-compute[71428]: DEBUG nova.compute.manager [req-37969772-c240-4c08-9616-371d05ccd222 req-995db8f4-8a52-414d-bc0d-ca2cc638480f service nova] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Received event network-vif-plugged-2a195833-5430-4cc1-a938-5b4157c32300 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:52:00 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-37969772-c240-4c08-9616-371d05ccd222 req-995db8f4-8a52-414d-bc0d-ca2cc638480f service nova] Acquiring lock "7c1f1f24-62f5-4e51-8c74-b5b89b6585a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:00 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-37969772-c240-4c08-9616-371d05ccd222 req-995db8f4-8a52-414d-bc0d-ca2cc638480f service nova] Lock "7c1f1f24-62f5-4e51-8c74-b5b89b6585a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:00 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-37969772-c240-4c08-9616-371d05ccd222 req-995db8f4-8a52-414d-bc0d-ca2cc638480f service nova] Lock "7c1f1f24-62f5-4e51-8c74-b5b89b6585a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:00 user nova-compute[71428]: DEBUG nova.compute.manager 
[req-37969772-c240-4c08-9616-371d05ccd222 req-995db8f4-8a52-414d-bc0d-ca2cc638480f service nova] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] No waiting events found dispatching network-vif-plugged-2a195833-5430-4cc1-a938-5b4157c32300 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:52:00 user nova-compute[71428]: WARNING nova.compute.manager [req-37969772-c240-4c08-9616-371d05ccd222 req-995db8f4-8a52-414d-bc0d-ca2cc638480f service nova] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Received unexpected event network-vif-plugged-2a195833-5430-4cc1-a938-5b4157c32300 for instance with vm_state active and task_state deleting. Apr 23 03:52:00 user nova-compute[71428]: DEBUG nova.compute.manager [req-37969772-c240-4c08-9616-371d05ccd222 req-995db8f4-8a52-414d-bc0d-ca2cc638480f service nova] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Received event network-vif-plugged-2a195833-5430-4cc1-a938-5b4157c32300 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:52:00 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-37969772-c240-4c08-9616-371d05ccd222 req-995db8f4-8a52-414d-bc0d-ca2cc638480f service nova] Acquiring lock "7c1f1f24-62f5-4e51-8c74-b5b89b6585a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:00 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-37969772-c240-4c08-9616-371d05ccd222 req-995db8f4-8a52-414d-bc0d-ca2cc638480f service nova] Lock "7c1f1f24-62f5-4e51-8c74-b5b89b6585a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:00 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-37969772-c240-4c08-9616-371d05ccd222 req-995db8f4-8a52-414d-bc0d-ca2cc638480f service nova] Lock "7c1f1f24-62f5-4e51-8c74-b5b89b6585a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:00 user nova-compute[71428]: DEBUG nova.compute.manager [req-37969772-c240-4c08-9616-371d05ccd222 req-995db8f4-8a52-414d-bc0d-ca2cc638480f service nova] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] No waiting events found dispatching network-vif-plugged-2a195833-5430-4cc1-a938-5b4157c32300 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:52:00 user nova-compute[71428]: WARNING nova.compute.manager [req-37969772-c240-4c08-9616-371d05ccd222 req-995db8f4-8a52-414d-bc0d-ca2cc638480f service nova] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Received unexpected event network-vif-plugged-2a195833-5430-4cc1-a938-5b4157c32300 for instance with vm_state active and task_state deleting. 
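The network-vif-plugged event for port 2a195833-5430-4cc1-a938-5b4157c32300 is delivered and discarded repeatedly while the instance is already in task_state deleting, hence the back-to-back warnings. A quick way to spot such duplicate deliveries in a saved copy of this journal; this is a sketch, and LOG_PATH is a placeholder file name, not something Nova produces:

import re
from collections import Counter

LOG_PATH = "nova-compute.log"  # placeholder for a saved copy of this journal
event_re = re.compile(r"\[instance: ([0-9a-f-]{36})\] Received event (network-\S+)")

counts = Counter()
with open(LOG_PATH) as fh:
    for line in fh:
        m = event_re.search(line)
        if m:
            counts[m.groups()] += 1

# Most frequently repeated (instance, event) pairs first.
for (uuid, event), n in counts.most_common(10):
    print(f"{n:3d}  {uuid}  {event}")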
Apr 23 03:52:00 user nova-compute[71428]: DEBUG nova.compute.manager [req-37969772-c240-4c08-9616-371d05ccd222 req-995db8f4-8a52-414d-bc0d-ca2cc638480f service nova] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Received event network-vif-plugged-2a195833-5430-4cc1-a938-5b4157c32300 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:52:00 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-37969772-c240-4c08-9616-371d05ccd222 req-995db8f4-8a52-414d-bc0d-ca2cc638480f service nova] Acquiring lock "7c1f1f24-62f5-4e51-8c74-b5b89b6585a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:00 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-37969772-c240-4c08-9616-371d05ccd222 req-995db8f4-8a52-414d-bc0d-ca2cc638480f service nova] Lock "7c1f1f24-62f5-4e51-8c74-b5b89b6585a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:00 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-37969772-c240-4c08-9616-371d05ccd222 req-995db8f4-8a52-414d-bc0d-ca2cc638480f service nova] Lock "7c1f1f24-62f5-4e51-8c74-b5b89b6585a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:00 user nova-compute[71428]: DEBUG nova.compute.manager [req-37969772-c240-4c08-9616-371d05ccd222 req-995db8f4-8a52-414d-bc0d-ca2cc638480f service nova] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] No waiting events found dispatching network-vif-plugged-2a195833-5430-4cc1-a938-5b4157c32300 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:52:00 user nova-compute[71428]: WARNING nova.compute.manager [req-37969772-c240-4c08-9616-371d05ccd222 req-995db8f4-8a52-414d-bc0d-ca2cc638480f service nova] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Received unexpected event network-vif-plugged-2a195833-5430-4cc1-a938-5b4157c32300 for instance with vm_state active and task_state deleting. 
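The port behind these events, 2a195833-5430-4cc1-a938-5b4157c32300, is dumped in full in the instance_info_cache entries above, including its fixed and floating addresses. A sketch of walking one such network_info entry; the values are copied from the dump above, but the traversal itself is inferred from that dump rather than taken from Nova code:

# One VIF entry, trimmed to the fields used below.
vif = {
    "id": "2a195833-5430-4cc1-a938-5b4157c32300",
    "address": "fa:16:3e:c3:a4:eb",
    "devname": "tap2a195833-54",
    "network": {
        "bridge": "br-int",
        "subnets": [{
            "cidr": "10.0.0.0/28",
            "ips": [{
                "address": "10.0.0.14",
                "type": "fixed",
                "floating_ips": [{"address": "172.24.4.134", "type": "floating"}],
            }],
        }],
    },
}

for subnet in vif["network"]["subnets"]:
    for ip in subnet["ips"]:
        print(f"{vif['devname']} fixed    {ip['address']} ({subnet['cidr']})")
        for fip in ip.get("floating_ips", []):
            print(f"{vif['devname']} floating {fip['address']}")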
Apr 23 03:52:00 user nova-compute[71428]: DEBUG nova.compute.manager [req-37969772-c240-4c08-9616-371d05ccd222 req-995db8f4-8a52-414d-bc0d-ca2cc638480f service nova] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Received event network-vif-unplugged-2a195833-5430-4cc1-a938-5b4157c32300 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:52:00 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-37969772-c240-4c08-9616-371d05ccd222 req-995db8f4-8a52-414d-bc0d-ca2cc638480f service nova] Acquiring lock "7c1f1f24-62f5-4e51-8c74-b5b89b6585a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:00 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-37969772-c240-4c08-9616-371d05ccd222 req-995db8f4-8a52-414d-bc0d-ca2cc638480f service nova] Lock "7c1f1f24-62f5-4e51-8c74-b5b89b6585a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:00 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-37969772-c240-4c08-9616-371d05ccd222 req-995db8f4-8a52-414d-bc0d-ca2cc638480f service nova] Lock "7c1f1f24-62f5-4e51-8c74-b5b89b6585a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:00 user nova-compute[71428]: DEBUG nova.compute.manager [req-37969772-c240-4c08-9616-371d05ccd222 req-995db8f4-8a52-414d-bc0d-ca2cc638480f service nova] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] No waiting events found dispatching network-vif-unplugged-2a195833-5430-4cc1-a938-5b4157c32300 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:52:00 user nova-compute[71428]: DEBUG nova.compute.manager [req-37969772-c240-4c08-9616-371d05ccd222 req-995db8f4-8a52-414d-bc0d-ca2cc638480f service nova] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Received event network-vif-unplugged-2a195833-5430-4cc1-a938-5b4157c32300 for instance with task_state deleting. {{(pid=71428) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 23 03:52:00 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Updating instance_info_cache with network_info: [] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:52:00 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Took 1.85 seconds to deallocate network for instance. 
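The inventory reported for provider 3017e09c-9289-4a8e-8061-3ff90149e985, logged above and repeated just below, determines how much Placement will let the scheduler allocate here via capacity = (total - reserved) * allocation_ratio. Worked out for the values in this log (plain arithmetic as a sketch, not Nova code):

inventory = {
    "VCPU":      {"total": 12,    "reserved": 0,   "allocation_ratio": 4.0},
    "MEMORY_MB": {"total": 16023, "reserved": 512, "allocation_ratio": 1.0},
    "DISK_GB":   {"total": 40,    "reserved": 0,   "allocation_ratio": 1.0},
}

for rc, inv in inventory.items():
    capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
    print(f"{rc}: {capacity:g}")
# -> VCPU: 48, MEMORY_MB: 15511, DISK_GB: 40

So although the final resource view above shows only 12 physical vCPUs with 5 in use, the 4.0 VCPU allocation ratio lets Placement accept up to 48 VCPU worth of allocations against this host.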
Apr 23 03:52:00 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f5a4f6f0-adfe-4bbf-89e4-440aa90c9188 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:00 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f5a4f6f0-adfe-4bbf-89e4-440aa90c9188 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:00 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-f5a4f6f0-adfe-4bbf-89e4-440aa90c9188 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:52:00 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-f5a4f6f0-adfe-4bbf-89e4-440aa90c9188 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:52:00 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f5a4f6f0-adfe-4bbf-89e4-440aa90c9188 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.187s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:00 user nova-compute[71428]: INFO nova.scheduler.client.report [None req-f5a4f6f0-adfe-4bbf-89e4-440aa90c9188 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Deleted allocations for instance 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4 Apr 23 03:52:00 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f5a4f6f0-adfe-4bbf-89e4-440aa90c9188 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "7c1f1f24-62f5-4e51-8c74-b5b89b6585a4" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.912s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:01 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f9eaf523-7034-4ec3-8c27-bd4d5e3d74c9 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Acquiring lock "91756f90-733e-4aa5-9108-d2d8b1d020fe" by 
"nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:01 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f9eaf523-7034-4ec3-8c27-bd4d5e3d74c9 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Lock "91756f90-733e-4aa5-9108-d2d8b1d020fe" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:01 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f9eaf523-7034-4ec3-8c27-bd4d5e3d74c9 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Acquiring lock "91756f90-733e-4aa5-9108-d2d8b1d020fe-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:01 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f9eaf523-7034-4ec3-8c27-bd4d5e3d74c9 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Lock "91756f90-733e-4aa5-9108-d2d8b1d020fe-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.002s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:01 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f9eaf523-7034-4ec3-8c27-bd4d5e3d74c9 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Lock "91756f90-733e-4aa5-9108-d2d8b1d020fe-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:01 user nova-compute[71428]: INFO nova.compute.manager [None req-f9eaf523-7034-4ec3-8c27-bd4d5e3d74c9 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Terminating instance Apr 23 03:52:01 user nova-compute[71428]: DEBUG nova.compute.manager [None req-f9eaf523-7034-4ec3-8c27-bd4d5e3d74c9 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Start destroying the instance on the hypervisor. 
{{(pid=71428) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 23 03:52:01 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:01 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:01 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:01 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:01 user nova-compute[71428]: DEBUG nova.compute.manager [req-e33b5c73-888b-4650-89c9-6a795ebb459b req-a754cf2d-10df-4837-83be-6b03ffba19e2 service nova] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Received event network-vif-unplugged-b7f45c1f-8b9b-4d06-a389-d7643565e934 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:52:01 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-e33b5c73-888b-4650-89c9-6a795ebb459b req-a754cf2d-10df-4837-83be-6b03ffba19e2 service nova] Acquiring lock "91756f90-733e-4aa5-9108-d2d8b1d020fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:01 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-e33b5c73-888b-4650-89c9-6a795ebb459b req-a754cf2d-10df-4837-83be-6b03ffba19e2 service nova] Lock "91756f90-733e-4aa5-9108-d2d8b1d020fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:01 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-e33b5c73-888b-4650-89c9-6a795ebb459b req-a754cf2d-10df-4837-83be-6b03ffba19e2 service nova] Lock "91756f90-733e-4aa5-9108-d2d8b1d020fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:01 user nova-compute[71428]: DEBUG nova.compute.manager [req-e33b5c73-888b-4650-89c9-6a795ebb459b req-a754cf2d-10df-4837-83be-6b03ffba19e2 service nova] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] No waiting events found dispatching network-vif-unplugged-b7f45c1f-8b9b-4d06-a389-d7643565e934 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:52:01 user nova-compute[71428]: DEBUG nova.compute.manager [req-e33b5c73-888b-4650-89c9-6a795ebb459b req-a754cf2d-10df-4837-83be-6b03ffba19e2 service nova] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Received event network-vif-unplugged-b7f45c1f-8b9b-4d06-a389-d7643565e934 for instance with task_state deleting. 
{{(pid=71428) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 23 03:52:02 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:02 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:02 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:02 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:02 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Instance destroyed successfully. Apr 23 03:52:02 user nova-compute[71428]: DEBUG nova.objects.instance [None req-f9eaf523-7034-4ec3-8c27-bd4d5e3d74c9 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Lazy-loading 'resources' on Instance uuid 91756f90-733e-4aa5-9108-d2d8b1d020fe {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:52:02 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-f9eaf523-7034-4ec3-8c27-bd4d5e3d74c9 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:47:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-1191600208',display_name='tempest-VolumesAdminNegativeTest-server-1191600208',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-1191600208',id=7,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAkS25YZ2FFDYFsUr7jiuPczrSOLIzJlU2a693RRWPK2F4nPp0/U+VjXRqgfa72DvpZuiWhq7U6qcmOGOvf1l/Ty3NboreHITR0GpknbVZF5naj3IXms4m51pMYQyScIuQ==',key_name='tempest-keypair-780145109',keypairs=,launch_index=0,launched_at=2023-04-23T03:47:35Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='5eb0a03655cf4aa78e27c81ea4e1c424',ramdisk_id='',reservation_id='r-9jr4wy53',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-VolumesAdminNegativeTest-594073230',owner_user_name='tempest-VolumesAdminNegativeTest-594073230-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-23T03:47:35Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='3e80c354abd34bd3a28ddaeec9535af2',uuid=91756f90-733e-4aa5-9108-d2d8b1d020fe,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b7f45c1f-8b9b-4d06-a389-d7643565e934", "address": "fa:16:3e:29:a9:72", "network": {"id": "dca689d5-53e5-466b-bd5c-4c29b330d27e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2008984306-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.168", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5eb0a03655cf4aa78e27c81ea4e1c424", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f45c1f-8b", "ovs_interfaceid": "b7f45c1f-8b9b-4d06-a389-d7643565e934", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 23 03:52:02 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-f9eaf523-7034-4ec3-8c27-bd4d5e3d74c9 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Converting VIF {"id": "b7f45c1f-8b9b-4d06-a389-d7643565e934", "address": "fa:16:3e:29:a9:72", "network": {"id": "dca689d5-53e5-466b-bd5c-4c29b330d27e", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2008984306-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.168", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5eb0a03655cf4aa78e27c81ea4e1c424", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f45c1f-8b", "ovs_interfaceid": "b7f45c1f-8b9b-4d06-a389-d7643565e934", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:52:02 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-f9eaf523-7034-4ec3-8c27-bd4d5e3d74c9 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:29:a9:72,bridge_name='br-int',has_traffic_filtering=True,id=b7f45c1f-8b9b-4d06-a389-d7643565e934,network=Network(dca689d5-53e5-466b-bd5c-4c29b330d27e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7f45c1f-8b') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:52:02 user nova-compute[71428]: DEBUG os_vif [None req-f9eaf523-7034-4ec3-8c27-bd4d5e3d74c9 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:29:a9:72,bridge_name='br-int',has_traffic_filtering=True,id=b7f45c1f-8b9b-4d06-a389-d7643565e934,network=Network(dca689d5-53e5-466b-bd5c-4c29b330d27e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7f45c1f-8b') {{(pid=71428) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 23 03:52:02 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:02 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb7f45c1f-8b, bridge=br-int, if_exists=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:52:02 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:02 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:02 user nova-compute[71428]: INFO os_vif [None req-f9eaf523-7034-4ec3-8c27-bd4d5e3d74c9 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:29:a9:72,bridge_name='br-int',has_traffic_filtering=True,id=b7f45c1f-8b9b-4d06-a389-d7643565e934,network=Network(dca689d5-53e5-466b-bd5c-4c29b330d27e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7f45c1f-8b') Apr 23 03:52:02 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-f9eaf523-7034-4ec3-8c27-bd4d5e3d74c9 
tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Deleting instance files /opt/stack/data/nova/instances/91756f90-733e-4aa5-9108-d2d8b1d020fe_del Apr 23 03:52:02 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-f9eaf523-7034-4ec3-8c27-bd4d5e3d74c9 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Deletion of /opt/stack/data/nova/instances/91756f90-733e-4aa5-9108-d2d8b1d020fe_del complete Apr 23 03:52:02 user nova-compute[71428]: INFO nova.compute.manager [None req-f9eaf523-7034-4ec3-8c27-bd4d5e3d74c9 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Took 0.72 seconds to destroy the instance on the hypervisor. Apr 23 03:52:02 user nova-compute[71428]: DEBUG oslo.service.loopingcall [None req-f9eaf523-7034-4ec3-8c27-bd4d5e3d74c9 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71428) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 23 03:52:02 user nova-compute[71428]: DEBUG nova.compute.manager [-] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Deallocating network for instance {{(pid=71428) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 23 03:52:02 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] deallocate_for_instance() {{(pid=71428) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 23 03:52:02 user nova-compute[71428]: DEBUG nova.compute.manager [req-bea8bc91-3c09-48cb-9a49-e90efc1200b1 req-9b6a6548-fbd7-442d-a762-68897c8be8c2 service nova] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Received event network-vif-deleted-2a195833-5430-4cc1-a938-5b4157c32300 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:52:02 user nova-compute[71428]: DEBUG nova.compute.manager [req-bea8bc91-3c09-48cb-9a49-e90efc1200b1 req-9b6a6548-fbd7-442d-a762-68897c8be8c2 service nova] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Received event network-vif-plugged-2a195833-5430-4cc1-a938-5b4157c32300 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:52:02 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-bea8bc91-3c09-48cb-9a49-e90efc1200b1 req-9b6a6548-fbd7-442d-a762-68897c8be8c2 service nova] Acquiring lock "7c1f1f24-62f5-4e51-8c74-b5b89b6585a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:02 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-bea8bc91-3c09-48cb-9a49-e90efc1200b1 req-9b6a6548-fbd7-442d-a762-68897c8be8c2 service nova] Lock "7c1f1f24-62f5-4e51-8c74-b5b89b6585a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:02 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-bea8bc91-3c09-48cb-9a49-e90efc1200b1 
req-9b6a6548-fbd7-442d-a762-68897c8be8c2 service nova] Lock "7c1f1f24-62f5-4e51-8c74-b5b89b6585a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:02 user nova-compute[71428]: DEBUG nova.compute.manager [req-bea8bc91-3c09-48cb-9a49-e90efc1200b1 req-9b6a6548-fbd7-442d-a762-68897c8be8c2 service nova] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] No waiting events found dispatching network-vif-plugged-2a195833-5430-4cc1-a938-5b4157c32300 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:52:02 user nova-compute[71428]: WARNING nova.compute.manager [req-bea8bc91-3c09-48cb-9a49-e90efc1200b1 req-9b6a6548-fbd7-442d-a762-68897c8be8c2 service nova] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Received unexpected event network-vif-plugged-2a195833-5430-4cc1-a938-5b4157c32300 for instance with vm_state deleted and task_state None. Apr 23 03:52:02 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:02 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:02 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:02 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:02 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:03 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Updating instance_info_cache with network_info: [] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:52:03 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Took 1.10 seconds to deallocate network for instance. 
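The inventory dict that the resource tracker logs at 03:52:00 above, and again unchanged at 03:52:03, 03:52:07 and 03:52:13 below, is what this compute node reports to Placement. As a reader's aid only, and assuming nothing beyond the standard Placement capacity rule of (total - reserved) * allocation_ratio, the schedulable capacity works out as follows:

# Inventory exactly as logged for provider 3017e09c-9289-4a8e-8061-3ff90149e985.
inventory = {
    'VCPU':      {'total': 12,    'reserved': 0,   'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 16023, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 40,    'reserved': 0,   'allocation_ratio': 1.0},
}

for rc, inv in inventory.items():
    # Placement's capacity rule: usable = (total - reserved) * allocation_ratio
    usable = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
    print(f"{rc}: {usable:g} schedulable units")

# Output: VCPU: 48, MEMORY_MB: 15511, DISK_GB: 40 schedulable units, which is
# why the claims for the tiny 1 vCPU / 128 MB Tempest instances in this log
# keep succeeding while the inventory itself never changes.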
Apr 23 03:52:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f9eaf523-7034-4ec3-8c27-bd4d5e3d74c9 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f9eaf523-7034-4ec3-8c27-bd4d5e3d74c9 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:03 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-f9eaf523-7034-4ec3-8c27-bd4d5e3d74c9 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:52:03 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-f9eaf523-7034-4ec3-8c27-bd4d5e3d74c9 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:52:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f9eaf523-7034-4ec3-8c27-bd4d5e3d74c9 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.186s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:03 user nova-compute[71428]: INFO nova.scheduler.client.report [None req-f9eaf523-7034-4ec3-8c27-bd4d5e3d74c9 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Deleted allocations for instance 91756f90-733e-4aa5-9108-d2d8b1d020fe Apr 23 03:52:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f9eaf523-7034-4ec3-8c27-bd4d5e3d74c9 tempest-VolumesAdminNegativeTest-594073230 tempest-VolumesAdminNegativeTest-594073230-project-member] Lock "91756f90-733e-4aa5-9108-d2d8b1d020fe" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.196s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:03 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:03 user nova-compute[71428]: DEBUG nova.compute.manager [req-8db4f709-8b62-4c9d-9efd-582ffd68bd6e 
req-1fe5f8a7-dc65-4a64-8975-9b64562fd597 service nova] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Received event network-vif-plugged-b7f45c1f-8b9b-4d06-a389-d7643565e934 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:52:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-8db4f709-8b62-4c9d-9efd-582ffd68bd6e req-1fe5f8a7-dc65-4a64-8975-9b64562fd597 service nova] Acquiring lock "91756f90-733e-4aa5-9108-d2d8b1d020fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-8db4f709-8b62-4c9d-9efd-582ffd68bd6e req-1fe5f8a7-dc65-4a64-8975-9b64562fd597 service nova] Lock "91756f90-733e-4aa5-9108-d2d8b1d020fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-8db4f709-8b62-4c9d-9efd-582ffd68bd6e req-1fe5f8a7-dc65-4a64-8975-9b64562fd597 service nova] Lock "91756f90-733e-4aa5-9108-d2d8b1d020fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:03 user nova-compute[71428]: DEBUG nova.compute.manager [req-8db4f709-8b62-4c9d-9efd-582ffd68bd6e req-1fe5f8a7-dc65-4a64-8975-9b64562fd597 service nova] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] No waiting events found dispatching network-vif-plugged-b7f45c1f-8b9b-4d06-a389-d7643565e934 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:52:03 user nova-compute[71428]: WARNING nova.compute.manager [req-8db4f709-8b62-4c9d-9efd-582ffd68bd6e req-1fe5f8a7-dc65-4a64-8975-9b64562fd597 service nova] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Received unexpected event network-vif-plugged-b7f45c1f-8b9b-4d06-a389-d7643565e934 for instance with vm_state deleted and task_state None. 
Apr 23 03:52:03 user nova-compute[71428]: DEBUG nova.compute.manager [req-8db4f709-8b62-4c9d-9efd-582ffd68bd6e req-1fe5f8a7-dc65-4a64-8975-9b64562fd597 service nova] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Received event network-vif-plugged-b7f45c1f-8b9b-4d06-a389-d7643565e934 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:52:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-8db4f709-8b62-4c9d-9efd-582ffd68bd6e req-1fe5f8a7-dc65-4a64-8975-9b64562fd597 service nova] Acquiring lock "91756f90-733e-4aa5-9108-d2d8b1d020fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-8db4f709-8b62-4c9d-9efd-582ffd68bd6e req-1fe5f8a7-dc65-4a64-8975-9b64562fd597 service nova] Lock "91756f90-733e-4aa5-9108-d2d8b1d020fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.002s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-8db4f709-8b62-4c9d-9efd-582ffd68bd6e req-1fe5f8a7-dc65-4a64-8975-9b64562fd597 service nova] Lock "91756f90-733e-4aa5-9108-d2d8b1d020fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:03 user nova-compute[71428]: DEBUG nova.compute.manager [req-8db4f709-8b62-4c9d-9efd-582ffd68bd6e req-1fe5f8a7-dc65-4a64-8975-9b64562fd597 service nova] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] No waiting events found dispatching network-vif-plugged-b7f45c1f-8b9b-4d06-a389-d7643565e934 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:52:03 user nova-compute[71428]: WARNING nova.compute.manager [req-8db4f709-8b62-4c9d-9efd-582ffd68bd6e req-1fe5f8a7-dc65-4a64-8975-9b64562fd597 service nova] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Received unexpected event network-vif-plugged-b7f45c1f-8b9b-4d06-a389-d7643565e934 for instance with vm_state deleted and task_state None. 
Apr 23 03:52:03 user nova-compute[71428]: DEBUG nova.compute.manager [req-8db4f709-8b62-4c9d-9efd-582ffd68bd6e req-1fe5f8a7-dc65-4a64-8975-9b64562fd597 service nova] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Received event network-vif-plugged-b7f45c1f-8b9b-4d06-a389-d7643565e934 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:52:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-8db4f709-8b62-4c9d-9efd-582ffd68bd6e req-1fe5f8a7-dc65-4a64-8975-9b64562fd597 service nova] Acquiring lock "91756f90-733e-4aa5-9108-d2d8b1d020fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-8db4f709-8b62-4c9d-9efd-582ffd68bd6e req-1fe5f8a7-dc65-4a64-8975-9b64562fd597 service nova] Lock "91756f90-733e-4aa5-9108-d2d8b1d020fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-8db4f709-8b62-4c9d-9efd-582ffd68bd6e req-1fe5f8a7-dc65-4a64-8975-9b64562fd597 service nova] Lock "91756f90-733e-4aa5-9108-d2d8b1d020fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:03 user nova-compute[71428]: DEBUG nova.compute.manager [req-8db4f709-8b62-4c9d-9efd-582ffd68bd6e req-1fe5f8a7-dc65-4a64-8975-9b64562fd597 service nova] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] No waiting events found dispatching network-vif-plugged-b7f45c1f-8b9b-4d06-a389-d7643565e934 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:52:03 user nova-compute[71428]: WARNING nova.compute.manager [req-8db4f709-8b62-4c9d-9efd-582ffd68bd6e req-1fe5f8a7-dc65-4a64-8975-9b64562fd597 service nova] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Received unexpected event network-vif-plugged-b7f45c1f-8b9b-4d06-a389-d7643565e934 for instance with vm_state deleted and task_state None. 
Apr 23 03:52:03 user nova-compute[71428]: DEBUG nova.compute.manager [req-8db4f709-8b62-4c9d-9efd-582ffd68bd6e req-1fe5f8a7-dc65-4a64-8975-9b64562fd597 service nova] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Received event network-vif-unplugged-b7f45c1f-8b9b-4d06-a389-d7643565e934 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:52:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-8db4f709-8b62-4c9d-9efd-582ffd68bd6e req-1fe5f8a7-dc65-4a64-8975-9b64562fd597 service nova] Acquiring lock "91756f90-733e-4aa5-9108-d2d8b1d020fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-8db4f709-8b62-4c9d-9efd-582ffd68bd6e req-1fe5f8a7-dc65-4a64-8975-9b64562fd597 service nova] Lock "91756f90-733e-4aa5-9108-d2d8b1d020fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-8db4f709-8b62-4c9d-9efd-582ffd68bd6e req-1fe5f8a7-dc65-4a64-8975-9b64562fd597 service nova] Lock "91756f90-733e-4aa5-9108-d2d8b1d020fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:03 user nova-compute[71428]: DEBUG nova.compute.manager [req-8db4f709-8b62-4c9d-9efd-582ffd68bd6e req-1fe5f8a7-dc65-4a64-8975-9b64562fd597 service nova] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] No waiting events found dispatching network-vif-unplugged-b7f45c1f-8b9b-4d06-a389-d7643565e934 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:52:03 user nova-compute[71428]: WARNING nova.compute.manager [req-8db4f709-8b62-4c9d-9efd-582ffd68bd6e req-1fe5f8a7-dc65-4a64-8975-9b64562fd597 service nova] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Received unexpected event network-vif-unplugged-b7f45c1f-8b9b-4d06-a389-d7643565e934 for instance with vm_state deleted and task_state None. 
Apr 23 03:52:03 user nova-compute[71428]: DEBUG nova.compute.manager [req-8db4f709-8b62-4c9d-9efd-582ffd68bd6e req-1fe5f8a7-dc65-4a64-8975-9b64562fd597 service nova] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Received event network-vif-plugged-b7f45c1f-8b9b-4d06-a389-d7643565e934 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:52:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-8db4f709-8b62-4c9d-9efd-582ffd68bd6e req-1fe5f8a7-dc65-4a64-8975-9b64562fd597 service nova] Acquiring lock "91756f90-733e-4aa5-9108-d2d8b1d020fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-8db4f709-8b62-4c9d-9efd-582ffd68bd6e req-1fe5f8a7-dc65-4a64-8975-9b64562fd597 service nova] Lock "91756f90-733e-4aa5-9108-d2d8b1d020fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-8db4f709-8b62-4c9d-9efd-582ffd68bd6e req-1fe5f8a7-dc65-4a64-8975-9b64562fd597 service nova] Lock "91756f90-733e-4aa5-9108-d2d8b1d020fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:03 user nova-compute[71428]: DEBUG nova.compute.manager [req-8db4f709-8b62-4c9d-9efd-582ffd68bd6e req-1fe5f8a7-dc65-4a64-8975-9b64562fd597 service nova] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] No waiting events found dispatching network-vif-plugged-b7f45c1f-8b9b-4d06-a389-d7643565e934 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:52:03 user nova-compute[71428]: WARNING nova.compute.manager [req-8db4f709-8b62-4c9d-9efd-582ffd68bd6e req-1fe5f8a7-dc65-4a64-8975-9b64562fd597 service nova] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Received unexpected event network-vif-plugged-b7f45c1f-8b9b-4d06-a389-d7643565e934 for instance with vm_state deleted and task_state None. 
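The run of WARNING entries above all trace one pattern: external events for port b7f45c1f-8b9b-4d06-a389-d7643565e934 keep arriving after instance 91756f90-733e-4aa5-9108-d2d8b1d020fe has already been deleted, so the compute manager takes the per-instance "-events" lock, finds nothing registered to wait on the event, and logs the "unexpected event" warning. A minimal, self-contained sketch of that dispatch pattern follows; names and structure are simplified stand-ins, not Nova's actual classes.

import threading

class InstanceEvents:
    """Tiny stand-in for the waiter registry behind the '<uuid>-events' lock."""
    def __init__(self):
        self._lock = threading.Lock()
        self._waiters = {}    # {instance_uuid: {event_name: threading.Event}}

    def pop_instance_event(self, instance_uuid, event_name):
        with self._lock:      # the Acquiring/acquired/released lines in the log
            return self._waiters.get(instance_uuid, {}).pop(event_name, None)

def external_instance_event(events, instance, event_name):
    waiter = events.pop_instance_event(instance["uuid"], event_name)
    if waiter is None:
        # "No waiting events found dispatching <event>", and, for an instance
        # that is already gone, the unexpected-event warning seen above.
        if instance["vm_state"] == "deleted":
            print(f"WARNING: Received unexpected event {event_name} for "
                  f"instance with vm_state deleted and task_state None.")
        return
    waiter.set()              # wake whatever registered to wait for the event

# Reproduces the warning path: the instance was torn down seconds earlier,
# so no waiter exists for the late network-vif-plugged event.
external_instance_event(
    InstanceEvents(),
    {"uuid": "91756f90-733e-4aa5-9108-d2d8b1d020fe", "vm_state": "deleted"},
    "network-vif-plugged-b7f45c1f-8b9b-4d06-a389-d7643565e934")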
Apr 23 03:52:04 user nova-compute[71428]: DEBUG nova.compute.manager [req-391da69a-4247-408b-b99c-cf10070218b1 req-2763b4b9-265e-400c-bd32-3e774579aafa service nova] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Received event network-vif-deleted-b7f45c1f-8b9b-4d06-a389-d7643565e934 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:52:06 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Acquiring lock "01477cd2-cdb7-41e0-96d4-3aff879b3093" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:06 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "01477cd2-cdb7-41e0-96d4-3aff879b3093" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:07 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] Starting instance... {{(pid=71428) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 23 03:52:07 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:07 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:07 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71428) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 23 03:52:07 user nova-compute[71428]: INFO nova.compute.claims [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] Claim successful on node user Apr 23 03:52:07 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:07 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:52:07 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:52:07 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.311s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:07 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] Start building networks asynchronously for instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 23 03:52:07 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] Allocating IP information in the background. 
{{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 23 03:52:07 user nova-compute[71428]: DEBUG nova.network.neutron [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] allocate_for_instance() {{(pid=71428) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 23 03:52:07 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 23 03:52:07 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] Start building block device mappings for instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 23 03:52:07 user nova-compute[71428]: INFO nova.virt.block_device [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] Booting with blank volume at /dev/vda Apr 23 03:52:07 user nova-compute[71428]: DEBUG nova.policy [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '99495726467944c38620831fc93e2856', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a79e09b1c4ae4cc5ab11c3e56ee4f0d9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71428) authorize /opt/stack/nova/nova/policy.py:203}} Apr 23 03:52:07 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:52:08 user nova-compute[71428]: DEBUG nova.network.neutron [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] Successfully created port: 277262c8-889d-4094-9836-f0d796d72230 {{(pid=71428) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 23 03:52:09 user nova-compute[71428]: DEBUG nova.network.neutron [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] Successfully updated port: 277262c8-889d-4094-9836-f0d796d72230 {{(pid=71428) _update_port 
/opt/stack/nova/nova/network/neutron.py:584}} Apr 23 03:52:09 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Acquiring lock "refresh_cache-01477cd2-cdb7-41e0-96d4-3aff879b3093" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:52:09 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Acquired lock "refresh_cache-01477cd2-cdb7-41e0-96d4-3aff879b3093" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:52:09 user nova-compute[71428]: DEBUG nova.network.neutron [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] Building network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 23 03:52:09 user nova-compute[71428]: DEBUG nova.compute.manager [req-5dd76643-82c4-44e4-baa1-c0f8ab965bba req-de01149e-5475-439b-91f0-68203f3ec169 service nova] [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] Received event network-changed-277262c8-889d-4094-9836-f0d796d72230 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:52:09 user nova-compute[71428]: DEBUG nova.compute.manager [req-5dd76643-82c4-44e4-baa1-c0f8ab965bba req-de01149e-5475-439b-91f0-68203f3ec169 service nova] [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] Refreshing instance network info cache due to event network-changed-277262c8-889d-4094-9836-f0d796d72230. {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 23 03:52:09 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-5dd76643-82c4-44e4-baa1-c0f8ab965bba req-de01149e-5475-439b-91f0-68203f3ec169 service nova] Acquiring lock "refresh_cache-01477cd2-cdb7-41e0-96d4-3aff879b3093" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:52:09 user nova-compute[71428]: DEBUG nova.network.neutron [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] Instance cache missing network info. 
{{(pid=71428) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 23 03:52:09 user nova-compute[71428]: DEBUG nova.network.neutron [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] Updating instance_info_cache with network_info: [{"id": "277262c8-889d-4094-9836-f0d796d72230", "address": "fa:16:3e:7e:83:0a", "network": {"id": "25de9e6c-4c02-4658-b076-466681d96605", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1644366708-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a79e09b1c4ae4cc5ab11c3e56ee4f0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap277262c8-88", "ovs_interfaceid": "277262c8-889d-4094-9836-f0d796d72230", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:52:09 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Releasing lock "refresh_cache-01477cd2-cdb7-41e0-96d4-3aff879b3093" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:52:09 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] Instance network_info: |[{"id": "277262c8-889d-4094-9836-f0d796d72230", "address": "fa:16:3e:7e:83:0a", "network": {"id": "25de9e6c-4c02-4658-b076-466681d96605", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1644366708-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a79e09b1c4ae4cc5ab11c3e56ee4f0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap277262c8-88", "ovs_interfaceid": "277262c8-889d-4094-9836-f0d796d72230", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 23 03:52:09 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-5dd76643-82c4-44e4-baa1-c0f8ab965bba req-de01149e-5475-439b-91f0-68203f3ec169 service nova] Acquired lock 
"refresh_cache-01477cd2-cdb7-41e0-96d4-3aff879b3093" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:52:09 user nova-compute[71428]: DEBUG nova.network.neutron [req-5dd76643-82c4-44e4-baa1-c0f8ab965bba req-de01149e-5475-439b-91f0-68203f3ec169 service nova] [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] Refreshing network info cache for port 277262c8-889d-4094-9836-f0d796d72230 {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 23 03:52:09 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:52:09 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:52:10 user nova-compute[71428]: DEBUG nova.network.neutron [req-5dd76643-82c4-44e4-baa1-c0f8ab965bba req-de01149e-5475-439b-91f0-68203f3ec169 service nova] [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] Updated VIF entry in instance network info cache for port 277262c8-889d-4094-9836-f0d796d72230. {{(pid=71428) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 23 03:52:10 user nova-compute[71428]: DEBUG nova.network.neutron [req-5dd76643-82c4-44e4-baa1-c0f8ab965bba req-de01149e-5475-439b-91f0-68203f3ec169 service nova] [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] Updating instance_info_cache with network_info: [{"id": "277262c8-889d-4094-9836-f0d796d72230", "address": "fa:16:3e:7e:83:0a", "network": {"id": "25de9e6c-4c02-4658-b076-466681d96605", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1644366708-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a79e09b1c4ae4cc5ab11c3e56ee4f0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap277262c8-88", "ovs_interfaceid": "277262c8-889d-4094-9836-f0d796d72230", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:52:10 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-5dd76643-82c4-44e4-baa1-c0f8ab965bba req-de01149e-5475-439b-91f0-68203f3ec169 service nova] Releasing lock "refresh_cache-01477cd2-cdb7-41e0-96d4-3aff879b3093" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:52:11 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} 
Apr 23 03:52:11 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Cleaning up deleted instances with incomplete migration {{(pid=71428) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 23 03:52:12 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:52:12 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:52:12 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:52:12 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71428) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 23 03:52:12 user nova-compute[71428]: WARNING nova.compute.manager [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Volume id: 5df61a82-82e1-4f41-8e5f-cb7bd483a63d finished being created but its status is error. Apr 23 03:52:12 user nova-compute[71428]: ERROR nova.compute.manager [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] Instance failed block device setup: nova.exception.VolumeNotCreated: Volume 5df61a82-82e1-4f41-8e5f-cb7bd483a63d did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error. 
Apr 23 03:52:12 user nova-compute[71428]: ERROR nova.compute.manager [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] Traceback (most recent call last): Apr 23 03:52:12 user nova-compute[71428]: ERROR nova.compute.manager [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] File "/opt/stack/nova/nova/compute/manager.py", line 2175, in _prep_block_device Apr 23 03:52:12 user nova-compute[71428]: ERROR nova.compute.manager [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] driver_block_device.attach_block_devices( Apr 23 03:52:12 user nova-compute[71428]: ERROR nova.compute.manager [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] File "/opt/stack/nova/nova/virt/block_device.py", line 936, in attach_block_devices Apr 23 03:52:12 user nova-compute[71428]: ERROR nova.compute.manager [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] _log_and_attach(device) Apr 23 03:52:12 user nova-compute[71428]: ERROR nova.compute.manager [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] File "/opt/stack/nova/nova/virt/block_device.py", line 933, in _log_and_attach Apr 23 03:52:12 user nova-compute[71428]: ERROR nova.compute.manager [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] bdm.attach(*attach_args, **attach_kwargs) Apr 23 03:52:12 user nova-compute[71428]: ERROR nova.compute.manager [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] File "/opt/stack/nova/nova/virt/block_device.py", line 848, in attach Apr 23 03:52:12 user nova-compute[71428]: ERROR nova.compute.manager [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] self.volume_id, self.attachment_id = self._create_volume( Apr 23 03:52:12 user nova-compute[71428]: ERROR nova.compute.manager [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] File "/opt/stack/nova/nova/virt/block_device.py", line 435, in _create_volume Apr 23 03:52:12 user nova-compute[71428]: ERROR nova.compute.manager [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] self._call_wait_func(context, wait_func, volume_api, vol['id']) Apr 23 03:52:12 user nova-compute[71428]: ERROR nova.compute.manager [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] File "/opt/stack/nova/nova/virt/block_device.py", line 785, in _call_wait_func Apr 23 03:52:12 user nova-compute[71428]: ERROR nova.compute.manager [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] with excutils.save_and_reraise_exception(): Apr 23 03:52:12 user nova-compute[71428]: ERROR nova.compute.manager [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ Apr 23 03:52:12 user nova-compute[71428]: ERROR nova.compute.manager [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] self.force_reraise() Apr 23 03:52:12 user nova-compute[71428]: ERROR nova.compute.manager [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise Apr 23 03:52:12 user nova-compute[71428]: ERROR nova.compute.manager [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] raise self.value Apr 23 03:52:12 user nova-compute[71428]: ERROR nova.compute.manager [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] File "/opt/stack/nova/nova/virt/block_device.py", line 783, in _call_wait_func Apr 23 03:52:12 user nova-compute[71428]: ERROR nova.compute.manager [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] wait_func(context, volume_id) Apr 23 03:52:12 user nova-compute[71428]: ERROR nova.compute.manager [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] File "/opt/stack/nova/nova/compute/manager.py", line 1792, in 
_await_block_device_map_created Apr 23 03:52:12 user nova-compute[71428]: ERROR nova.compute.manager [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] raise exception.VolumeNotCreated(volume_id=vol_id, Apr 23 03:52:12 user nova-compute[71428]: ERROR nova.compute.manager [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] nova.exception.VolumeNotCreated: Volume 5df61a82-82e1-4f41-8e5f-cb7bd483a63d did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error. Apr 23 03:52:12 user nova-compute[71428]: ERROR nova.compute.manager [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] Apr 23 03:52:12 user nova-compute[71428]: DEBUG nova.compute.claims [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] Aborting claim: {{(pid=71428) abort /opt/stack/nova/nova/compute/claims.py:84}} Apr 23 03:52:12 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:12 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:13 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:52:13 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:52:13 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.442s {{(pid=71428) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:13 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] Build of instance 01477cd2-cdb7-41e0-96d4-3aff879b3093 aborted: Volume 5df61a82-82e1-4f41-8e5f-cb7bd483a63d did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error. {{(pid=71428) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2636}} Apr 23 03:52:13 user nova-compute[71428]: DEBUG nova.compute.utils [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] Build of instance 01477cd2-cdb7-41e0-96d4-3aff879b3093 aborted: Volume 5df61a82-82e1-4f41-8e5f-cb7bd483a63d did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error. {{(pid=71428) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} Apr 23 03:52:13 user nova-compute[71428]: ERROR nova.compute.manager [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] Build of instance 01477cd2-cdb7-41e0-96d4-3aff879b3093 aborted: Volume 5df61a82-82e1-4f41-8e5f-cb7bd483a63d did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error.: nova.exception.BuildAbortException: Build of instance 01477cd2-cdb7-41e0-96d4-3aff879b3093 aborted: Volume 5df61a82-82e1-4f41-8e5f-cb7bd483a63d did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error. 
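For context on the VolumeNotCreated/BuildAbortException sequence above: before attaching a boot volume, nova-compute polls the volume's Cinder status a limited number of times (the "5 seconds or 2 attempts" in the message reflect the deployment's block_device_allocate_retries / block_device_allocate_retries_interval settings) and aborts the build if the volume lands in "error". A minimal illustrative sketch of that wait pattern, not nova's actual _await_block_device_map_created implementation; get_volume_status is a hypothetical callable (for example a python-cinderclient volumes.get(volume_id).status lookup):

import time

class VolumeNotCreated(Exception):
    pass

def wait_for_volume_available(volume_id, get_volume_status, attempts=2, interval=3):
    # Poll until the volume leaves 'creating'; give up after `attempts` polls
    # spaced `interval` seconds apart, or immediately if it reaches 'error'.
    status = 'unknown'
    for _ in range(attempts):
        status = get_volume_status(volume_id)
        if status == 'available':
            return
        if status == 'error':
            break
        time.sleep(interval)
    raise VolumeNotCreated(
        f"Volume {volume_id} did not finish being created "
        f"(last status: {status})")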
Apr 23 03:52:13 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] Unplugging VIFs for instance {{(pid=71428) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} Apr 23 03:52:13 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:52:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1895964229',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1895964229',id=14,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a79e09b1c4ae4cc5ab11c3e56ee4f0d9',ramdisk_id='',reservation_id='r-pcmgj6lu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-653032906',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member'},tags=TagList,task_state='block_device_mapping',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:52:07Z,user_data=None,user_id='99495726467944c38620831fc93e2856',uuid=01477cd2-cdb7-41e0-96d4-3aff879b3093,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "277262c8-889d-4094-9836-f0d796d72230", "address": "fa:16:3e:7e:83:0a", "network": {"id": "25de9e6c-4c02-4658-b076-466681d96605", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1644366708-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a79e09b1c4ae4cc5ab11c3e56ee4f0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": 
"tap277262c8-88", "ovs_interfaceid": "277262c8-889d-4094-9836-f0d796d72230", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 23 03:52:13 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Converting VIF {"id": "277262c8-889d-4094-9836-f0d796d72230", "address": "fa:16:3e:7e:83:0a", "network": {"id": "25de9e6c-4c02-4658-b076-466681d96605", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1644366708-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a79e09b1c4ae4cc5ab11c3e56ee4f0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap277262c8-88", "ovs_interfaceid": "277262c8-889d-4094-9836-f0d796d72230", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:52:13 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7e:83:0a,bridge_name='br-int',has_traffic_filtering=True,id=277262c8-889d-4094-9836-f0d796d72230,network=Network(25de9e6c-4c02-4658-b076-466681d96605),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap277262c8-88') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:52:13 user nova-compute[71428]: DEBUG os_vif [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7e:83:0a,bridge_name='br-int',has_traffic_filtering=True,id=277262c8-889d-4094-9836-f0d796d72230,network=Network(25de9e6c-4c02-4658-b076-466681d96605),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap277262c8-88') {{(pid=71428) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 23 03:52:13 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:13 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap277262c8-88, bridge=br-int, if_exists=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:52:13 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change 
{{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 23 03:52:13 user nova-compute[71428]: INFO os_vif [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7e:83:0a,bridge_name='br-int',has_traffic_filtering=True,id=277262c8-889d-4094-9836-f0d796d72230,network=Network(25de9e6c-4c02-4658-b076-466681d96605),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap277262c8-88') Apr 23 03:52:13 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] Unplugged VIFs for instance {{(pid=71428) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} Apr 23 03:52:13 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] Deallocating network for instance {{(pid=71428) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 23 03:52:13 user nova-compute[71428]: DEBUG nova.network.neutron [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] deallocate_for_instance() {{(pid=71428) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 23 03:52:13 user nova-compute[71428]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:52:13 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] VM Stopped (Lifecycle Event) Apr 23 03:52:13 user nova-compute[71428]: DEBUG nova.compute.manager [None req-54d4e4c4-03c5-4984-8c9f-ebcf6e9839c9 None None] [instance: 7c1f1f24-62f5-4e51-8c74-b5b89b6585a4] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:52:13 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:52:13 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:13 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:13 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None 
req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:13 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Auditing locally available compute resources for user (node: user) {{(pid=71428) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 23 03:52:13 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/90cce4ec-f48c-408f-8d8f-46414bde01df/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:52:13 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/90cce4ec-f48c-408f-8d8f-46414bde01df/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:52:13 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/90cce4ec-f48c-408f-8d8f-46414bde01df/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:52:14 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/90cce4ec-f48c-408f-8d8f-46414bde01df/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:52:14 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:52:14 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk --force-share --output=json" returned: 0 in 0.150s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:52:14 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None 
None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:52:14 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk --force-share --output=json" returned: 0 in 0.157s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:52:14 user nova-compute[71428]: DEBUG nova.network.neutron [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] Updating instance_info_cache with network_info: [] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:52:14 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/954f41b2-5c67-41a4-b38b-ebf3ad60cac7/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:52:14 user nova-compute[71428]: INFO nova.compute.manager [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 01477cd2-cdb7-41e0-96d4-3aff879b3093] Took 1.05 seconds to deallocate network for instance. 
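The qemu-img probes in the surrounding records are how the update_available_resource audit measures per-instance disk usage; the full command line is logged verbatim, so the probe can be replayed by hand. A small sketch that re-runs the same invocation (the path below is one of the disks from the log; oslo_concurrency.prlimit caps the child at 1 GiB of address space and 30 s of CPU, and --force-share lets qemu-img read an image that is in use):

import json
import subprocess

def qemu_img_info(disk_path):
    # Same invocation as logged by the resource tracker audit above.
    cmd = [
        "/usr/bin/python3.10", "-m", "oslo_concurrency.prlimit",
        "--as=1073741824", "--cpu=30", "--",
        "env", "LC_ALL=C", "LANG=C",
        "qemu-img", "info", disk_path, "--force-share", "--output=json",
    ]
    return json.loads(subprocess.check_output(cmd))

info = qemu_img_info(
    "/opt/stack/data/nova/instances/90cce4ec-f48c-408f-8d8f-46414bde01df/disk")
print(info["format"], info["virtual-size"], info["actual-size"])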
Apr 23 03:52:14 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c1d74b78-c254-4d71-b634-27382b03030a tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Acquiring lock "90cce4ec-f48c-408f-8d8f-46414bde01df" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:14 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c1d74b78-c254-4d71-b634-27382b03030a tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lock "90cce4ec-f48c-408f-8d8f-46414bde01df" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:14 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c1d74b78-c254-4d71-b634-27382b03030a tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Acquiring lock "90cce4ec-f48c-408f-8d8f-46414bde01df-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:14 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c1d74b78-c254-4d71-b634-27382b03030a tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lock "90cce4ec-f48c-408f-8d8f-46414bde01df-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:14 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c1d74b78-c254-4d71-b634-27382b03030a tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lock "90cce4ec-f48c-408f-8d8f-46414bde01df-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:14 user nova-compute[71428]: INFO nova.compute.manager [None req-c1d74b78-c254-4d71-b634-27382b03030a tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Terminating instance Apr 23 03:52:14 user nova-compute[71428]: DEBUG nova.compute.manager [None req-c1d74b78-c254-4d71-b634-27382b03030a tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Start destroying the instance on the hypervisor. 
{{(pid=71428) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 23 03:52:14 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/954f41b2-5c67-41a4-b38b-ebf3ad60cac7/disk --force-share --output=json" returned: 0 in 0.144s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:52:14 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/954f41b2-5c67-41a4-b38b-ebf3ad60cac7/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:52:14 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:14 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:14 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:14 user nova-compute[71428]: INFO nova.scheduler.client.report [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Deleted allocations for instance 01477cd2-cdb7-41e0-96d4-3aff879b3093 Apr 23 03:52:14 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-e6b88cf4-0fc3-4fe0-a9c7-d5c6d7961657 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "01477cd2-cdb7-41e0-96d4-3aff879b3093" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.723s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:14 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/954f41b2-5c67-41a4-b38b-ebf3ad60cac7/disk --force-share --output=json" returned: 0 in 0.150s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:52:15 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:15 user nova-compute[71428]: DEBUG nova.compute.manager [req-6afe64c2-8dd5-4fb4-9f7b-9d5f527d777f req-35ccd24f-1453-4cf4-8a57-6b9a8f7be9af service nova] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Received event network-vif-unplugged-c9e332a6-c2c7-41d3-9646-e5f672d8cb95 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:52:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils 
[req-6afe64c2-8dd5-4fb4-9f7b-9d5f527d777f req-35ccd24f-1453-4cf4-8a57-6b9a8f7be9af service nova] Acquiring lock "90cce4ec-f48c-408f-8d8f-46414bde01df-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-6afe64c2-8dd5-4fb4-9f7b-9d5f527d777f req-35ccd24f-1453-4cf4-8a57-6b9a8f7be9af service nova] Lock "90cce4ec-f48c-408f-8d8f-46414bde01df-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-6afe64c2-8dd5-4fb4-9f7b-9d5f527d777f req-35ccd24f-1453-4cf4-8a57-6b9a8f7be9af service nova] Lock "90cce4ec-f48c-408f-8d8f-46414bde01df-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:15 user nova-compute[71428]: DEBUG nova.compute.manager [req-6afe64c2-8dd5-4fb4-9f7b-9d5f527d777f req-35ccd24f-1453-4cf4-8a57-6b9a8f7be9af service nova] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] No waiting events found dispatching network-vif-unplugged-c9e332a6-c2c7-41d3-9646-e5f672d8cb95 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:52:15 user nova-compute[71428]: DEBUG nova.compute.manager [req-6afe64c2-8dd5-4fb4-9f7b-9d5f527d777f req-35ccd24f-1453-4cf4-8a57-6b9a8f7be9af service nova] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Received event network-vif-unplugged-c9e332a6-c2c7-41d3-9646-e5f672d8cb95 for instance with task_state deleting. {{(pid=71428) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 23 03:52:15 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:52:15 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
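The Acquiring/acquired/released triplets in these records are emitted by oslo.concurrency's lockutils when nova wraps a critical section in a named lock; the doubled dots in names such as pop_instance_event.._pop_event mark nested helper functions (Python's <locals> qualifier, whose angle brackets did not survive this transcript). A minimal sketch of the same locking pattern, with illustrative names rather than nova's own code:

from oslo_concurrency import lockutils

@lockutils.synchronized('compute_resources')
def update_usage():
    # Only one greenthread at a time may touch the state guarded by the
    # "compute_resources" lock; lock entry and exit are logged at DEBUG.
    pass

# Context-manager form, comparable to the per-instance "<uuid>-events" locks:
with lockutils.lock('90cce4ec-f48c-408f-8d8f-46414bde01df-events'):
    pass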
Apr 23 03:52:15 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Hypervisor/Node resource view: name=user free_ram=8707MB free_disk=26.22916030883789GB free_vcpus=9 pci_devices=[{"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71428) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 23 03:52:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:15 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Instance destroyed successfully. Apr 23 03:52:15 user nova-compute[71428]: DEBUG nova.objects.instance [None req-c1d74b78-c254-4d71-b634-27382b03030a tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lazy-loading 'resources' on Instance uuid 90cce4ec-f48c-408f-8d8f-46414bde01df {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:52:15 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-c1d74b78-c254-4d71-b634-27382b03030a tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:50:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-524528732',display_name='tempest-TestMinimumBasicScenario-server-524528732',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-524528732',id=13,image_ref='9c1b8f6e-3455-4961-bbec-6a8a6708371c',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIAmBt4gctDSizuDDJrh4pQ0geBulERx8NzMYT+fgBg4gRMdKT3R/5EiARevhCiAiWq2K21pOaqKy/T04eohZfy3bE6LZ/4YHDUN1RqyVnppPqbYR1ciJUTaIPmbmDg0QQ==',key_name='tempest-TestMinimumBasicScenario-962162734',keypairs=,launch_index=0,launched_at=2023-04-23T03:50:28Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='8ab0f01751954d04a83b360b2f839716',ramdisk_id='',reservation_id='r-m79x3jny',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='9c1b8f6e-3455-4961-bbec-6a8a6708371c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestMinimumBasicScenario-1558592168',owner_user_name='tempest-TestMinimumBasicScenario-1558592168-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-23T03:50:29Z,user_data=None,user_id='93a8dbfd8cef4578aff742813ffe901e',uuid=90cce4ec-f48c-408f-8d8f-46414bde01df,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c9e332a6-c2c7-41d3-9646-e5f672d8cb95", "address": "fa:16:3e:58:83:cd", "network": {"id": "a603a26f-3b06-4792-80ae-73cea9f9d53b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1385818779-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8ab0f01751954d04a83b360b2f839716", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9e332a6-c2", "ovs_interfaceid": "c9e332a6-c2c7-41d3-9646-e5f672d8cb95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 23 03:52:15 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-c1d74b78-c254-4d71-b634-27382b03030a tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Converting VIF {"id": "c9e332a6-c2c7-41d3-9646-e5f672d8cb95", "address": "fa:16:3e:58:83:cd", "network": {"id": "a603a26f-3b06-4792-80ae-73cea9f9d53b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1385818779-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8ab0f01751954d04a83b360b2f839716", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": 
{"0": "ovn"}}, "devname": "tapc9e332a6-c2", "ovs_interfaceid": "c9e332a6-c2c7-41d3-9646-e5f672d8cb95", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:52:15 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-c1d74b78-c254-4d71-b634-27382b03030a tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:83:cd,bridge_name='br-int',has_traffic_filtering=True,id=c9e332a6-c2c7-41d3-9646-e5f672d8cb95,network=Network(a603a26f-3b06-4792-80ae-73cea9f9d53b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9e332a6-c2') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:52:15 user nova-compute[71428]: DEBUG os_vif [None req-c1d74b78-c254-4d71-b634-27382b03030a tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:83:cd,bridge_name='br-int',has_traffic_filtering=True,id=c9e332a6-c2c7-41d3-9646-e5f672d8cb95,network=Network(a603a26f-3b06-4792-80ae-73cea9f9d53b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9e332a6-c2') {{(pid=71428) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 23 03:52:15 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:15 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc9e332a6-c2, bridge=br-int, if_exists=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:52:15 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:15 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:52:15 user nova-compute[71428]: INFO os_vif [None req-c1d74b78-c254-4d71-b634-27382b03030a tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:83:cd,bridge_name='br-int',has_traffic_filtering=True,id=c9e332a6-c2c7-41d3-9646-e5f672d8cb95,network=Network(a603a26f-3b06-4792-80ae-73cea9f9d53b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9e332a6-c2') Apr 23 03:52:15 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-c1d74b78-c254-4d71-b634-27382b03030a tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Deleting instance files /opt/stack/data/nova/instances/90cce4ec-f48c-408f-8d8f-46414bde01df_del Apr 23 03:52:15 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-c1d74b78-c254-4d71-b634-27382b03030a tempest-TestMinimumBasicScenario-1558592168 
tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Deletion of /opt/stack/data/nova/instances/90cce4ec-f48c-408f-8d8f-46414bde01df_del complete Apr 23 03:52:15 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 954f41b2-5c67-41a4-b38b-ebf3ad60cac7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:52:15 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance c8480a7c-b5de-4f66-a4f0-08fc679b0dfd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:52:15 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 90cce4ec-f48c-408f-8d8f-46414bde01df actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:52:15 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Total usable vcpus: 12, total allocated vcpus: 3 {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 23 03:52:15 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Final resource view: name=user phys_ram=16023MB used_ram=896MB phys_disk=40GB used_disk=3GB total_vcpus=12 used_vcpus=3 pci_stats=[] {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 23 03:52:15 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:15 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:15 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:15 user nova-compute[71428]: INFO nova.compute.manager [None req-c1d74b78-c254-4d71-b634-27382b03030a tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Took 1.32 seconds to destroy the instance on the hypervisor. Apr 23 03:52:15 user nova-compute[71428]: DEBUG oslo.service.loopingcall [None req-c1d74b78-c254-4d71-b634-27382b03030a tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71428) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 23 03:52:15 user nova-compute[71428]: DEBUG nova.compute.manager [-] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Deallocating network for instance {{(pid=71428) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 23 03:52:15 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] deallocate_for_instance() {{(pid=71428) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 23 03:52:15 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:52:15 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:52:15 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:15 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:15 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Compute_service record updated for user:user {{(pid=71428) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 23 03:52:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.280s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:15 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:52:15 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Cleaning up deleted instances {{(pid=71428) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 23 03:52:15 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] There are 0 instances to clean {{(pid=71428) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 23 03:52:15 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens 
{{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:52:16 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Updating instance_info_cache with network_info: [] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:52:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Acquiring lock "39775e82-0123-41cf-ad97-d3865a796828" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Lock "39775e82-0123-41cf-ad97-d3865a796828" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:16 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Took 0.52 seconds to deallocate network for instance. Apr 23 03:52:16 user nova-compute[71428]: DEBUG nova.compute.manager [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Starting instance... {{(pid=71428) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 23 03:52:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c1d74b78-c254-4d71-b634-27382b03030a tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c1d74b78-c254-4d71-b634-27382b03030a tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:16 user nova-compute[71428]: DEBUG nova.compute.manager [req-62f96328-d16d-4e06-9b19-d7215266aba7 req-2e5261ea-6d78-4d68-a667-1f0baa3215e4 service nova] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Received event network-vif-deleted-c9e332a6-c2c7-41d3-9646-e5f672d8cb95 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:52:16 user nova-compute[71428]: DEBUG 
nova.compute.provider_tree [None req-c1d74b78-c254-4d71-b634-27382b03030a tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:52:16 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-c1d74b78-c254-4d71-b634-27382b03030a tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:52:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c1d74b78-c254-4d71-b634-27382b03030a tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.188s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.151s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:16 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71428) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 23 03:52:16 user nova-compute[71428]: INFO nova.compute.claims [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Claim successful on node user Apr 23 03:52:16 user nova-compute[71428]: INFO nova.scheduler.client.report [None req-c1d74b78-c254-4d71-b634-27382b03030a tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Deleted allocations for instance 90cce4ec-f48c-408f-8d8f-46414bde01df Apr 23 03:52:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-c1d74b78-c254-4d71-b634-27382b03030a tempest-TestMinimumBasicScenario-1558592168 tempest-TestMinimumBasicScenario-1558592168-project-member] Lock "90cce4ec-f48c-408f-8d8f-46414bde01df" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.204s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:16 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:52:16 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:52:16 user nova-compute[71428]: DEBUG nova.compute.manager [req-5dcf805d-abb1-4964-9340-1c18cf4fca34 req-d836b67d-c59b-4dd6-a3f5-bd108d8308cc service nova] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Received event network-vif-plugged-c9e332a6-c2c7-41d3-9646-e5f672d8cb95 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:52:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-5dcf805d-abb1-4964-9340-1c18cf4fca34 req-d836b67d-c59b-4dd6-a3f5-bd108d8308cc service nova] Acquiring lock "90cce4ec-f48c-408f-8d8f-46414bde01df-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-5dcf805d-abb1-4964-9340-1c18cf4fca34 req-d836b67d-c59b-4dd6-a3f5-bd108d8308cc service nova] Lock "90cce4ec-f48c-408f-8d8f-46414bde01df-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils 
[req-5dcf805d-abb1-4964-9340-1c18cf4fca34 req-d836b67d-c59b-4dd6-a3f5-bd108d8308cc service nova] Lock "90cce4ec-f48c-408f-8d8f-46414bde01df-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:16 user nova-compute[71428]: DEBUG nova.compute.manager [req-5dcf805d-abb1-4964-9340-1c18cf4fca34 req-d836b67d-c59b-4dd6-a3f5-bd108d8308cc service nova] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] No waiting events found dispatching network-vif-plugged-c9e332a6-c2c7-41d3-9646-e5f672d8cb95 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:52:16 user nova-compute[71428]: WARNING nova.compute.manager [req-5dcf805d-abb1-4964-9340-1c18cf4fca34 req-d836b67d-c59b-4dd6-a3f5-bd108d8308cc service nova] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Received unexpected event network-vif-plugged-c9e332a6-c2c7-41d3-9646-e5f672d8cb95 for instance with vm_state deleted and task_state None. Apr 23 03:52:16 user nova-compute[71428]: DEBUG nova.compute.manager [req-5dcf805d-abb1-4964-9340-1c18cf4fca34 req-d836b67d-c59b-4dd6-a3f5-bd108d8308cc service nova] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Received event network-vif-plugged-c9e332a6-c2c7-41d3-9646-e5f672d8cb95 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:52:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-5dcf805d-abb1-4964-9340-1c18cf4fca34 req-d836b67d-c59b-4dd6-a3f5-bd108d8308cc service nova] Acquiring lock "90cce4ec-f48c-408f-8d8f-46414bde01df-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-5dcf805d-abb1-4964-9340-1c18cf4fca34 req-d836b67d-c59b-4dd6-a3f5-bd108d8308cc service nova] Lock "90cce4ec-f48c-408f-8d8f-46414bde01df-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-5dcf805d-abb1-4964-9340-1c18cf4fca34 req-d836b67d-c59b-4dd6-a3f5-bd108d8308cc service nova] Lock "90cce4ec-f48c-408f-8d8f-46414bde01df-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:16 user nova-compute[71428]: DEBUG nova.compute.manager [req-5dcf805d-abb1-4964-9340-1c18cf4fca34 req-d836b67d-c59b-4dd6-a3f5-bd108d8308cc service nova] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] No waiting events found dispatching network-vif-plugged-c9e332a6-c2c7-41d3-9646-e5f672d8cb95 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:52:16 user nova-compute[71428]: WARNING nova.compute.manager [req-5dcf805d-abb1-4964-9340-1c18cf4fca34 req-d836b67d-c59b-4dd6-a3f5-bd108d8308cc service nova] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Received unexpected event network-vif-plugged-c9e332a6-c2c7-41d3-9646-e5f672d8cb95 for instance with vm_state deleted and task_state None. 
Apr 23 03:52:16 user nova-compute[71428]: DEBUG nova.compute.manager [req-5dcf805d-abb1-4964-9340-1c18cf4fca34 req-d836b67d-c59b-4dd6-a3f5-bd108d8308cc service nova] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Received event network-vif-plugged-c9e332a6-c2c7-41d3-9646-e5f672d8cb95 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:52:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-5dcf805d-abb1-4964-9340-1c18cf4fca34 req-d836b67d-c59b-4dd6-a3f5-bd108d8308cc service nova] Acquiring lock "90cce4ec-f48c-408f-8d8f-46414bde01df-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-5dcf805d-abb1-4964-9340-1c18cf4fca34 req-d836b67d-c59b-4dd6-a3f5-bd108d8308cc service nova] Lock "90cce4ec-f48c-408f-8d8f-46414bde01df-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-5dcf805d-abb1-4964-9340-1c18cf4fca34 req-d836b67d-c59b-4dd6-a3f5-bd108d8308cc service nova] Lock "90cce4ec-f48c-408f-8d8f-46414bde01df-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:16 user nova-compute[71428]: DEBUG nova.compute.manager [req-5dcf805d-abb1-4964-9340-1c18cf4fca34 req-d836b67d-c59b-4dd6-a3f5-bd108d8308cc service nova] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] No waiting events found dispatching network-vif-plugged-c9e332a6-c2c7-41d3-9646-e5f672d8cb95 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:52:16 user nova-compute[71428]: WARNING nova.compute.manager [req-5dcf805d-abb1-4964-9340-1c18cf4fca34 req-d836b67d-c59b-4dd6-a3f5-bd108d8308cc service nova] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Received unexpected event network-vif-plugged-c9e332a6-c2c7-41d3-9646-e5f672d8cb95 for instance with vm_state deleted and task_state None. 
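The repeating pairs above, "No waiting events found dispatching network-vif-plugged-…" followed by "Received unexpected event … for instance with vm_state deleted", come from Nova's external-instance-event handling: the compute manager keeps a per-instance table of events it has registered interest in, and an incoming Neutron notification either completes a registered waiter or is logged as unexpected (here the instance was already deleted, so nothing was waiting). The snippet below is only an illustrative Python sketch of that pattern under those assumptions; the class and function names are invented and this is not Nova's actual InstanceEvents code.

import threading
from collections import defaultdict

class EventRegistry:
    """Toy model of a per-instance table of awaited external events (illustrative only)."""

    def __init__(self):
        self._lock = threading.Lock()
        # instance_uuid -> {event_name: threading.Event}
        self._waiting = defaultdict(dict)

    def prepare(self, instance_uuid, event_name):
        """Register interest in an event before triggering the external action."""
        done = threading.Event()
        with self._lock:
            self._waiting[instance_uuid][event_name] = done
        return done

    def pop(self, instance_uuid, event_name):
        """Called when a notification arrives; returns the waiter or None."""
        with self._lock:
            return self._waiting.get(instance_uuid, {}).pop(event_name, None)

def handle_external_event(registry, instance_uuid, event_name):
    waiter = registry.pop(instance_uuid, event_name)
    if waiter is None:
        # Mirrors the "No waiting events found dispatching ..." and
        # "Received unexpected event ..." records in the log above.
        print(f"unexpected event {event_name} for instance {instance_uuid}")
    else:
        waiter.set()  # unblocks whoever called prepare() and is waiting on it

In the log above the delete has already torn the instance down, so every pop comes back empty and the warning branch fires for each repeated notification.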
Apr 23 03:52:16 user nova-compute[71428]: DEBUG nova.compute.manager [req-5dcf805d-abb1-4964-9340-1c18cf4fca34 req-d836b67d-c59b-4dd6-a3f5-bd108d8308cc service nova] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Received event network-vif-plugged-c9e332a6-c2c7-41d3-9646-e5f672d8cb95 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:52:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-5dcf805d-abb1-4964-9340-1c18cf4fca34 req-d836b67d-c59b-4dd6-a3f5-bd108d8308cc service nova] Acquiring lock "90cce4ec-f48c-408f-8d8f-46414bde01df-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-5dcf805d-abb1-4964-9340-1c18cf4fca34 req-d836b67d-c59b-4dd6-a3f5-bd108d8308cc service nova] Lock "90cce4ec-f48c-408f-8d8f-46414bde01df-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-5dcf805d-abb1-4964-9340-1c18cf4fca34 req-d836b67d-c59b-4dd6-a3f5-bd108d8308cc service nova] Lock "90cce4ec-f48c-408f-8d8f-46414bde01df-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:16 user nova-compute[71428]: DEBUG nova.compute.manager [req-5dcf805d-abb1-4964-9340-1c18cf4fca34 req-d836b67d-c59b-4dd6-a3f5-bd108d8308cc service nova] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] No waiting events found dispatching network-vif-plugged-c9e332a6-c2c7-41d3-9646-e5f672d8cb95 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:52:16 user nova-compute[71428]: WARNING nova.compute.manager [req-5dcf805d-abb1-4964-9340-1c18cf4fca34 req-d836b67d-c59b-4dd6-a3f5-bd108d8308cc service nova] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Received unexpected event network-vif-plugged-c9e332a6-c2c7-41d3-9646-e5f672d8cb95 for instance with vm_state deleted and task_state None. Apr 23 03:52:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.225s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:16 user nova-compute[71428]: DEBUG nova.compute.manager [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Start building networks asynchronously for instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 23 03:52:16 user nova-compute[71428]: DEBUG nova.compute.manager [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Allocating IP information in the background. 
{{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 23 03:52:16 user nova-compute[71428]: DEBUG nova.network.neutron [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] [instance: 39775e82-0123-41cf-ad97-d3865a796828] allocate_for_instance() {{(pid=71428) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 23 03:52:16 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 23 03:52:16 user nova-compute[71428]: DEBUG nova.compute.manager [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Start building block device mappings for instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 23 03:52:16 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:52:16 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:52:16 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:52:16 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._sync_power_states {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:52:17 user nova-compute[71428]: WARNING nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] While synchronizing instance power states, found 3 instances in the database and 2 instances on the hypervisor. 
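The periodic resource-tracker passes above keep reporting "Inventory has not changed" with the same inventory dictionary. As a rough illustration only, assuming the usual Placement convention that the capacity available for allocation is (total - reserved) * allocation_ratio, those figures work out to 48 VCPU, 15511 MEMORY_MB and 40 DISK_GB. A small Python sketch of that arithmetic (the helper is illustrative, not Nova or Placement code):

# Inventory dict as logged by the resource tracker above (trimmed to the relevant keys).
INVENTORY = {
    'VCPU':      {'total': 12,    'reserved': 0,   'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 16023, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 40,    'reserved': 0,   'allocation_ratio': 1.0},
}

def usable_capacity(inventory):
    """Capacity the scheduler can hand out: (total - reserved) * allocation_ratio."""
    return {rc: (v['total'] - v['reserved']) * v['allocation_ratio']
            for rc, v in inventory.items()}

print(usable_capacity(INVENTORY))
# {'VCPU': 48.0, 'MEMORY_MB': 15511.0, 'DISK_GB': 40.0}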
Apr 23 03:52:17 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Triggering sync for uuid 954f41b2-5c67-41a4-b38b-ebf3ad60cac7 {{(pid=71428) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 23 03:52:17 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Triggering sync for uuid c8480a7c-b5de-4f66-a4f0-08fc679b0dfd {{(pid=71428) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 23 03:52:17 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Triggering sync for uuid 39775e82-0123-41cf-ad97-d3865a796828 {{(pid=71428) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 23 03:52:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "954f41b2-5c67-41a4-b38b-ebf3ad60cac7" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "954f41b2-5c67-41a4-b38b-ebf3ad60cac7" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "c8480a7c-b5de-4f66-a4f0-08fc679b0dfd" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "c8480a7c-b5de-4f66-a4f0-08fc679b0dfd" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "39775e82-0123-41cf-ad97-d3865a796828" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:17 user nova-compute[71428]: DEBUG nova.compute.manager [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Start spawning the instance on the hypervisor. 
{{(pid=71428) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 23 03:52:17 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Creating instance directory {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 23 03:52:17 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Creating image(s) Apr 23 03:52:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Acquiring lock "/opt/stack/data/nova/instances/39775e82-0123-41cf-ad97-d3865a796828/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Lock "/opt/stack/data/nova/instances/39775e82-0123-41cf-ad97-d3865a796828/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Lock "/opt/stack/data/nova/instances/39775e82-0123-41cf-ad97-d3865a796828/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:17 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:52:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "954f41b2-5c67-41a4-b38b-ebf3ad60cac7" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.087s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "c8480a7c-b5de-4f66-a4f0-08fc679b0dfd" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.089s {{(pid=71428) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:17 user nova-compute[71428]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:52:17 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] VM Stopped (Lifecycle Event) Apr 23 03:52:17 user nova-compute[71428]: DEBUG nova.compute.manager [None req-0565676c-c40a-4b2f-b9e7-bed900b4ea1e None None] [instance: 91756f90-733e-4aa5-9108-d2d8b1d020fe] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:52:17 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.133s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:52:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Acquiring lock "935dc3474dddbde110531a31510941caddc0ae83" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:17 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:52:17 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.125s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:52:17 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o 
backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/39775e82-0123-41cf-ad97-d3865a796828/disk 1073741824 {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:52:17 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/39775e82-0123-41cf-ad97-d3865a796828/disk 1073741824" returned: 0 in 0.052s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:52:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.184s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:17 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:52:17 user nova-compute[71428]: DEBUG nova.policy [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '405152a49bf14ad68f5c96f767c70b3a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '037d3963711a4727a4def98cfff4da67', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71428) authorize /opt/stack/nova/nova/policy.py:203}} Apr 23 03:52:17 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.130s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:52:17 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Checking if we can resize image /opt/stack/data/nova/instances/39775e82-0123-41cf-ad97-d3865a796828/disk. 
size=1073741824 {{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 23 03:52:17 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/39775e82-0123-41cf-ad97-d3865a796828/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:52:17 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/39775e82-0123-41cf-ad97-d3865a796828/disk --force-share --output=json" returned: 0 in 0.160s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:52:17 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Cannot resize image /opt/stack/data/nova/instances/39775e82-0123-41cf-ad97-d3865a796828/disk to a smaller size. {{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 23 03:52:17 user nova-compute[71428]: DEBUG nova.objects.instance [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Lazy-loading 'migration_context' on Instance uuid 39775e82-0123-41cf-ad97-d3865a796828 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:52:17 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:52:17 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Starting heal instance info cache {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 23 03:52:17 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Created local disks {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 23 03:52:17 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Ensure instance console log exists: /opt/stack/data/nova/instances/39775e82-0123-41cf-ad97-d3865a796828/console.log {{(pid=71428) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 23 03:52:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 
tempest-VolumesActionsTest-911298328-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "refresh_cache-c8480a7c-b5de-4f66-a4f0-08fc679b0dfd" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:52:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquired lock "refresh_cache-c8480a7c-b5de-4f66-a4f0-08fc679b0dfd" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:52:17 user nova-compute[71428]: DEBUG nova.network.neutron [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Forcefully refreshing network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 23 03:52:18 user nova-compute[71428]: DEBUG nova.network.neutron [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Successfully created port: 83fd1d96-1b09-4965-9a3b-a362e9ffc06e {{(pid=71428) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 23 03:52:18 user nova-compute[71428]: DEBUG nova.network.neutron [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Updating instance_info_cache with network_info: [{"id": "e7548d21-c865-4648-a8a6-dba748a01e14", "address": "fa:16:3e:08:69:16", "network": {"id": "25de9e6c-4c02-4658-b076-466681d96605", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1644366708-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a79e09b1c4ae4cc5ab11c3e56ee4f0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape7548d21-c8", "ovs_interfaceid": "e7548d21-c865-4648-a8a6-dba748a01e14", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, 
"preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:52:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Releasing lock "refresh_cache-c8480a7c-b5de-4f66-a4f0-08fc679b0dfd" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:52:18 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Updated the network info_cache for instance {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG nova.network.neutron [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Successfully updated port: 83fd1d96-1b09-4965-9a3b-a362e9ffc06e {{(pid=71428) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Acquiring lock "refresh_cache-39775e82-0123-41cf-ad97-d3865a796828" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Acquired lock "refresh_cache-39775e82-0123-41cf-ad97-d3865a796828" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG nova.network.neutron [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Building network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG nova.compute.manager [req-e581d466-eb41-4e7a-b2ab-153aa340f93a req-5aef251f-6b35-41d1-80ee-123eba4195ee service nova] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Received event network-changed-83fd1d96-1b09-4965-9a3b-a362e9ffc06e {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG nova.compute.manager [req-e581d466-eb41-4e7a-b2ab-153aa340f93a req-5aef251f-6b35-41d1-80ee-123eba4195ee service nova] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Refreshing instance network info cache due to event network-changed-83fd1d96-1b09-4965-9a3b-a362e9ffc06e. 
{{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-e581d466-eb41-4e7a-b2ab-153aa340f93a req-5aef251f-6b35-41d1-80ee-123eba4195ee service nova] Acquiring lock "refresh_cache-39775e82-0123-41cf-ad97-d3865a796828" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG nova.network.neutron [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Instance cache missing network info. {{(pid=71428) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG nova.network.neutron [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Updating instance_info_cache with network_info: [{"id": "83fd1d96-1b09-4965-9a3b-a362e9ffc06e", "address": "fa:16:3e:6a:9d:c3", "network": {"id": "b6746973-374e-4be6-9067-1f3d7424fe1d", "bridge": "br-int", "label": "tempest-VolumesActionsTest-376048108-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "037d3963711a4727a4def98cfff4da67", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap83fd1d96-1b", "ovs_interfaceid": "83fd1d96-1b09-4965-9a3b-a362e9ffc06e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Releasing lock "refresh_cache-39775e82-0123-41cf-ad97-d3865a796828" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG nova.compute.manager [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Instance network_info: |[{"id": "83fd1d96-1b09-4965-9a3b-a362e9ffc06e", "address": "fa:16:3e:6a:9d:c3", "network": {"id": "b6746973-374e-4be6-9067-1f3d7424fe1d", "bridge": "br-int", "label": "tempest-VolumesActionsTest-376048108-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "037d3963711a4727a4def98cfff4da67", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": 
"tap83fd1d96-1b", "ovs_interfaceid": "83fd1d96-1b09-4965-9a3b-a362e9ffc06e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-e581d466-eb41-4e7a-b2ab-153aa340f93a req-5aef251f-6b35-41d1-80ee-123eba4195ee service nova] Acquired lock "refresh_cache-39775e82-0123-41cf-ad97-d3865a796828" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG nova.network.neutron [req-e581d466-eb41-4e7a-b2ab-153aa340f93a req-5aef251f-6b35-41d1-80ee-123eba4195ee service nova] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Refreshing network info cache for port 83fd1d96-1b09-4965-9a3b-a362e9ffc06e {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Start _get_guest_xml network_info=[{"id": "83fd1d96-1b09-4965-9a3b-a362e9ffc06e", "address": "fa:16:3e:6a:9d:c3", "network": {"id": "b6746973-374e-4be6-9067-1f3d7424fe1d", "bridge": "br-int", "label": "tempest-VolumesActionsTest-376048108-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "037d3963711a4727a4def98cfff4da67", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap83fd1d96-1b", "ovs_interfaceid": "83fd1d96-1b09-4965-9a3b-a362e9ffc06e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'image_id': 'e6127373-9931-4277-9458-eceef653ea1e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 23 03:52:19 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 
tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:52:19 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:52:19 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71428) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-23T03:43:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=), allow threads: True {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Flavor limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Image limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Flavor pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Image pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71428) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:425}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Got 1 possible topologies {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:52:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-1320327354',display_name='tempest-VolumesActionsTest-instance-1320327354',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesactionstest-instance-1320327354',id=15,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='037d3963711a4727a4def98cfff4da67',ramdisk_id='',reservation_id='r-d5cw2pd0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesActionsTest-911298328',owner_user_name='tempest-VolumesActionsTest-911298328-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:52:17Z,user_data=None,user_id='405152a49bf14ad68f5c96f767c70b3a',uuid=39775e82-0123-41cf-ad97-d3865a796828,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "83fd1d96-1b09-4965-9a3b-a362e9ffc06e", "address": "fa:16:3e:6a:9d:c3", "network": {"id": "b6746973-374e-4be6-9067-1f3d7424fe1d", "bridge": "br-int", "label": "tempest-VolumesActionsTest-376048108-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "037d3963711a4727a4def98cfff4da67", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap83fd1d96-1b", "ovs_interfaceid": "83fd1d96-1b09-4965-9a3b-a362e9ffc06e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71428) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Converting VIF {"id": "83fd1d96-1b09-4965-9a3b-a362e9ffc06e", "address": "fa:16:3e:6a:9d:c3", "network": {"id": "b6746973-374e-4be6-9067-1f3d7424fe1d", "bridge": "br-int", "label": 
"tempest-VolumesActionsTest-376048108-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "037d3963711a4727a4def98cfff4da67", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap83fd1d96-1b", "ovs_interfaceid": "83fd1d96-1b09-4965-9a3b-a362e9ffc06e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:9d:c3,bridge_name='br-int',has_traffic_filtering=True,id=83fd1d96-1b09-4965-9a3b-a362e9ffc06e,network=Network(b6746973-374e-4be6-9067-1f3d7424fe1d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83fd1d96-1b') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG nova.objects.instance [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Lazy-loading 'pci_devices' on Instance uuid 39775e82-0123-41cf-ad97-d3865a796828 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] [instance: 39775e82-0123-41cf-ad97-d3865a796828] End _get_guest_xml xml= Apr 23 03:52:19 user nova-compute[71428]: 39775e82-0123-41cf-ad97-d3865a796828 Apr 23 03:52:19 user nova-compute[71428]: instance-0000000f Apr 23 03:52:19 user nova-compute[71428]: 131072 Apr 23 03:52:19 user nova-compute[71428]: 1 Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: tempest-VolumesActionsTest-instance-1320327354 Apr 23 03:52:19 user nova-compute[71428]: 2023-04-23 03:52:19 Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: 128 Apr 23 03:52:19 user nova-compute[71428]: 1 Apr 23 03:52:19 user nova-compute[71428]: 0 Apr 23 03:52:19 user nova-compute[71428]: 0 Apr 23 03:52:19 user nova-compute[71428]: 1 Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: tempest-VolumesActionsTest-911298328-project-member Apr 23 03:52:19 user nova-compute[71428]: tempest-VolumesActionsTest-911298328 Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 
03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: OpenStack Foundation Apr 23 03:52:19 user nova-compute[71428]: OpenStack Nova Apr 23 03:52:19 user nova-compute[71428]: 0.0.0 Apr 23 03:52:19 user nova-compute[71428]: 39775e82-0123-41cf-ad97-d3865a796828 Apr 23 03:52:19 user nova-compute[71428]: 39775e82-0123-41cf-ad97-d3865a796828 Apr 23 03:52:19 user nova-compute[71428]: Virtual Machine Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: hvm Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Nehalem Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: /dev/urandom Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: Apr 23 03:52:19 user nova-compute[71428]: {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:52:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-1320327354',display_name='tempest-VolumesActionsTest-instance-1320327354',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesactionstest-instance-1320327354',id=15,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='037d3963711a4727a4def98cfff4da67',ramdisk_id='',reservation_id='r-d5cw2pd0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesActionsTest-911298328',owner_user_name='tempest-VolumesActionsTest-911298328-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:52:17Z,user_data=None,user_id='405152a49bf14ad68f5c96f767c70b3a',uuid=39775e82-0123-41cf-ad97-d3865a796828,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "83fd1d96-1b09-4965-9a3b-a362e9ffc06e", "address": "fa:16:3e:6a:9d:c3", "network": {"id": "b6746973-374e-4be6-9067-1f3d7424fe1d", "bridge": "br-int", "label": "tempest-VolumesActionsTest-376048108-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "037d3963711a4727a4def98cfff4da67", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap83fd1d96-1b", "ovs_interfaceid": "83fd1d96-1b09-4965-9a3b-a362e9ffc06e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Converting VIF {"id": "83fd1d96-1b09-4965-9a3b-a362e9ffc06e", "address": "fa:16:3e:6a:9d:c3", "network": {"id": "b6746973-374e-4be6-9067-1f3d7424fe1d", "bridge": "br-int", "label": 
"tempest-VolumesActionsTest-376048108-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "037d3963711a4727a4def98cfff4da67", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap83fd1d96-1b", "ovs_interfaceid": "83fd1d96-1b09-4965-9a3b-a362e9ffc06e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:9d:c3,bridge_name='br-int',has_traffic_filtering=True,id=83fd1d96-1b09-4965-9a3b-a362e9ffc06e,network=Network(b6746973-374e-4be6-9067-1f3d7424fe1d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83fd1d96-1b') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG os_vif [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:9d:c3,bridge_name='br-int',has_traffic_filtering=True,id=83fd1d96-1b09-4965-9a3b-a362e9ffc06e,network=Network(b6746973-374e-4be6-9067-1f3d7424fe1d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83fd1d96-1b') {{(pid=71428) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap83fd1d96-1b, may_exist=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap83fd1d96-1b, col_values=(('external_ids', {'iface-id': 
'83fd1d96-1b09-4965-9a3b-a362e9ffc06e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6a:9d:c3', 'vm-uuid': '39775e82-0123-41cf-ad97-d3865a796828'}),)) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:19 user nova-compute[71428]: INFO os_vif [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:9d:c3,bridge_name='br-int',has_traffic_filtering=True,id=83fd1d96-1b09-4965-9a3b-a362e9ffc06e,network=Network(b6746973-374e-4be6-9067-1f3d7424fe1d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83fd1d96-1b') Apr 23 03:52:19 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] No BDM found with device name vda, not building metadata. {{(pid=71428) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 23 03:52:19 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] No VIF found with MAC fa:16:3e:6a:9d:c3, not building metadata {{(pid=71428) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 23 03:52:20 user nova-compute[71428]: DEBUG nova.network.neutron [req-e581d466-eb41-4e7a-b2ab-153aa340f93a req-5aef251f-6b35-41d1-80ee-123eba4195ee service nova] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Updated VIF entry in instance network info cache for port 83fd1d96-1b09-4965-9a3b-a362e9ffc06e. 
{{(pid=71428) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 23 03:52:20 user nova-compute[71428]: DEBUG nova.network.neutron [req-e581d466-eb41-4e7a-b2ab-153aa340f93a req-5aef251f-6b35-41d1-80ee-123eba4195ee service nova] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Updating instance_info_cache with network_info: [{"id": "83fd1d96-1b09-4965-9a3b-a362e9ffc06e", "address": "fa:16:3e:6a:9d:c3", "network": {"id": "b6746973-374e-4be6-9067-1f3d7424fe1d", "bridge": "br-int", "label": "tempest-VolumesActionsTest-376048108-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "037d3963711a4727a4def98cfff4da67", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap83fd1d96-1b", "ovs_interfaceid": "83fd1d96-1b09-4965-9a3b-a362e9ffc06e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:52:20 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-e581d466-eb41-4e7a-b2ab-153aa340f93a req-5aef251f-6b35-41d1-80ee-123eba4195ee service nova] Releasing lock "refresh_cache-39775e82-0123-41cf-ad97-d3865a796828" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:52:20 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:21 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:21 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:21 user nova-compute[71428]: DEBUG nova.compute.manager [req-6e2afdcd-e107-47b1-838e-90d2dc2ffb1b req-18451572-65ee-477a-8621-53366ba8f5f2 service nova] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Received event network-vif-plugged-83fd1d96-1b09-4965-9a3b-a362e9ffc06e {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:52:21 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-6e2afdcd-e107-47b1-838e-90d2dc2ffb1b req-18451572-65ee-477a-8621-53366ba8f5f2 service nova] Acquiring lock "39775e82-0123-41cf-ad97-d3865a796828-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:21 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-6e2afdcd-e107-47b1-838e-90d2dc2ffb1b req-18451572-65ee-477a-8621-53366ba8f5f2 service nova] Lock "39775e82-0123-41cf-ad97-d3865a796828-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:21 user nova-compute[71428]: DEBUG 
oslo_concurrency.lockutils [req-6e2afdcd-e107-47b1-838e-90d2dc2ffb1b req-18451572-65ee-477a-8621-53366ba8f5f2 service nova] Lock "39775e82-0123-41cf-ad97-d3865a796828-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:21 user nova-compute[71428]: DEBUG nova.compute.manager [req-6e2afdcd-e107-47b1-838e-90d2dc2ffb1b req-18451572-65ee-477a-8621-53366ba8f5f2 service nova] [instance: 39775e82-0123-41cf-ad97-d3865a796828] No waiting events found dispatching network-vif-plugged-83fd1d96-1b09-4965-9a3b-a362e9ffc06e {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:52:21 user nova-compute[71428]: WARNING nova.compute.manager [req-6e2afdcd-e107-47b1-838e-90d2dc2ffb1b req-18451572-65ee-477a-8621-53366ba8f5f2 service nova] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Received unexpected event network-vif-plugged-83fd1d96-1b09-4965-9a3b-a362e9ffc06e for instance with vm_state building and task_state spawning. Apr 23 03:52:23 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Resumed> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:52:23 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 39775e82-0123-41cf-ad97-d3865a796828] VM Resumed (Lifecycle Event) Apr 23 03:52:23 user nova-compute[71428]: DEBUG nova.compute.manager [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Instance event wait completed in 0 seconds for {{(pid=71428) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 23 03:52:23 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Guest created on hypervisor {{(pid=71428) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 23 03:52:23 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Instance spawned successfully. 
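[Editor's sketch] The VIF plug logged above runs as a single ovsdbapp transaction (AddBridgeCommand, AddPortCommand, DbSetCommand on the Interface row). For readers who want to reproduce or inspect the result by hand, the following Python snippet issues a rough ovs-vsctl equivalent; this is an illustration only, not what nova executes (os-vif talks to ovsdb-server directly through ovsdbapp), and it assumes ovs-vsctl is installed on the host. All names and values are taken from the log entries above.

# Approximate ovs-vsctl equivalent of the ovsdbapp transaction seen in the log.
import subprocess

BRIDGE = "br-int"
PORT = "tap83fd1d96-1b"  # devname from the VIF
EXTERNAL_IDS = {
    "iface-id": "83fd1d96-1b09-4965-9a3b-a362e9ffc06e",   # neutron port id
    "iface-status": "active",
    "attached-mac": "fa:16:3e:6a:9d:c3",
    "vm-uuid": "39775e82-0123-41cf-ad97-d3865a796828",
}

# AddBridgeCommand(name=br-int, may_exist=True, datapath_type=system)
subprocess.run(["ovs-vsctl", "--may-exist", "add-br", BRIDGE,
                "--", "set", "Bridge", BRIDGE, "datapath_type=system"], check=True)
# AddPortCommand(bridge=br-int, port=tap83fd1d96-1b, may_exist=True)
subprocess.run(["ovs-vsctl", "--may-exist", "add-port", BRIDGE, PORT], check=True)
# DbSetCommand(table=Interface, record=tap83fd1d96-1b, external_ids={...})
subprocess.run(["ovs-vsctl", "set", "Interface", PORT]
               + [f"external_ids:{k}={v}" for k, v in EXTERNAL_IDS.items()],
               check=True)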
Apr 23 03:52:23 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 23 03:52:23 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:52:23 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 03:52:23 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Found default for hw_cdrom_bus of ide {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:52:23 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Found default for hw_disk_bus of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:52:23 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Found default for hw_input_bus of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:52:23 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Found default for hw_pointer_model of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:52:23 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Found default for hw_video_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:52:23 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Found default for hw_vif_model of virtio {{(pid=71428) 
_register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:52:23 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 39775e82-0123-41cf-ad97-d3865a796828] During sync_power_state the instance has a pending task (spawning). Skip. Apr 23 03:52:23 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Started> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:52:23 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 39775e82-0123-41cf-ad97-d3865a796828] VM Started (Lifecycle Event) Apr 23 03:52:23 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:52:23 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 03:52:23 user nova-compute[71428]: DEBUG nova.compute.manager [req-0183b798-c888-4367-8e4f-c4c873cca202 req-63b312cc-da1b-4f2e-af2a-2e5266d16db7 service nova] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Received event network-vif-plugged-83fd1d96-1b09-4965-9a3b-a362e9ffc06e {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:52:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-0183b798-c888-4367-8e4f-c4c873cca202 req-63b312cc-da1b-4f2e-af2a-2e5266d16db7 service nova] Acquiring lock "39775e82-0123-41cf-ad97-d3865a796828-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-0183b798-c888-4367-8e4f-c4c873cca202 req-63b312cc-da1b-4f2e-af2a-2e5266d16db7 service nova] Lock "39775e82-0123-41cf-ad97-d3865a796828-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.002s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-0183b798-c888-4367-8e4f-c4c873cca202 req-63b312cc-da1b-4f2e-af2a-2e5266d16db7 service nova] Lock "39775e82-0123-41cf-ad97-d3865a796828-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:23 user nova-compute[71428]: DEBUG nova.compute.manager [req-0183b798-c888-4367-8e4f-c4c873cca202 req-63b312cc-da1b-4f2e-af2a-2e5266d16db7 service nova] [instance: 39775e82-0123-41cf-ad97-d3865a796828] No waiting events found dispatching network-vif-plugged-83fd1d96-1b09-4965-9a3b-a362e9ffc06e {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:52:23 user nova-compute[71428]: WARNING nova.compute.manager 
[req-0183b798-c888-4367-8e4f-c4c873cca202 req-63b312cc-da1b-4f2e-af2a-2e5266d16db7 service nova] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Received unexpected event network-vif-plugged-83fd1d96-1b09-4965-9a3b-a362e9ffc06e for instance with vm_state building and task_state spawning. Apr 23 03:52:23 user nova-compute[71428]: INFO nova.compute.manager [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Took 6.20 seconds to spawn the instance on the hypervisor. Apr 23 03:52:23 user nova-compute[71428]: DEBUG nova.compute.manager [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:52:23 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 39775e82-0123-41cf-ad97-d3865a796828] During sync_power_state the instance has a pending task (spawning). Skip. Apr 23 03:52:23 user nova-compute[71428]: INFO nova.compute.manager [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Took 6.89 seconds to build instance. Apr 23 03:52:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-540910f9-ac4d-4ce2-b497-db34a0f9be61 tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Lock "39775e82-0123-41cf-ad97-d3865a796828" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.977s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "39775e82-0123-41cf-ad97-d3865a796828" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 6.339s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:23 user nova-compute[71428]: INFO nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: 39775e82-0123-41cf-ad97-d3865a796828] During sync_power_state the instance has a pending task (block_device_mapping). Skip. 
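[Editor's sketch] The "Synchronizing instance power state" entries above report numeric codes ("current DB power_state: 0, VM power_state: 1"). These follow nova's nova.compute.power_state constants; the lookup table below is listed for reference, assuming the standard values shipped with nova.

# Decode the numeric power states seen in the sync_power_state log entries.
POWER_STATES = {
    0x00: "NOSTATE",    # DB value while the instance is still building
    0x01: "RUNNING",    # what the hypervisor reports once the guest starts
    0x03: "PAUSED",
    0x04: "SHUTDOWN",
    0x06: "CRASHED",
    0x07: "SUSPENDED",
}

print(POWER_STATES[0], "->", POWER_STATES[1])  # NOSTATE -> RUNNING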
Apr 23 03:52:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "39775e82-0123-41cf-ad97-d3865a796828" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:23 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:24 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:29 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:52:29 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:29 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71428) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 23 03:52:29 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 03:52:29 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 03:52:29 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:30 user nova-compute[71428]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:52:30 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] VM Stopped (Lifecycle Event) Apr 23 03:52:30 user nova-compute[71428]: DEBUG nova.compute.manager [None req-a1dca0e2-0b29-4df3-b5f1-a0369b979f1d None None] [instance: 90cce4ec-f48c-408f-8d8f-46414bde01df] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:52:34 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:52:34 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:34 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71428) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 23 03:52:34 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 03:52:34 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} 
Apr 23 03:52:34 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:38 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:39 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:43 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:44 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:48 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:49 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:51 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Acquiring lock "c21ec3a3-0761-4543-ad43-dca2913d159b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:51 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "c21ec3a3-0761-4543-ad43-dca2913d159b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:51 user nova-compute[71428]: DEBUG nova.compute.manager [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Starting instance... 
{{(pid=71428) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 23 03:52:51 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:51 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:51 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71428) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 23 03:52:51 user nova-compute[71428]: INFO nova.compute.claims [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Claim successful on node user Apr 23 03:52:51 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Refreshing inventories for resource provider 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 23 03:52:51 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Updating ProviderTree inventory for provider 3017e09c-9289-4a8e-8061-3ff90149e985 from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 23 03:52:51 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Updating inventory in ProviderTree for provider 3017e09c-9289-4a8e-8061-3ff90149e985 with inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) update_inventory 
/opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 23 03:52:51 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Refreshing aggregate associations for resource provider 3017e09c-9289-4a8e-8061-3ff90149e985, aggregates: None {{(pid=71428) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 23 03:52:51 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Refreshing trait associations for resource provider 3017e09c-9289-4a8e-8061-3ff90149e985, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE41,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_RESCUE_BFV,COMPUTE_NODE,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_GRAPHICS_MODEL_CIRRUS {{(pid=71428) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 23 03:52:52 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:52:52 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:52:52 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.363s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:52 user nova-compute[71428]: DEBUG nova.compute.manager [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Start building networks asynchronously for instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 23 03:52:52 user nova-compute[71428]: DEBUG nova.compute.manager [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Allocating IP information in the background. {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 23 03:52:52 user nova-compute[71428]: DEBUG nova.network.neutron [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] allocate_for_instance() {{(pid=71428) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 23 03:52:52 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 23 03:52:52 user nova-compute[71428]: DEBUG nova.compute.manager [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Start building block device mappings for instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 23 03:52:52 user nova-compute[71428]: DEBUG nova.policy [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0a2c459cad014b07b2613e5e261d88aa', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '24fff486a500421397ecb935828582cd', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71428) authorize /opt/stack/nova/nova/policy.py:203}} Apr 23 03:52:52 user nova-compute[71428]: DEBUG nova.compute.manager [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Start spawning the instance on the hypervisor. 
{{(pid=71428) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 23 03:52:52 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Creating instance directory {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 23 03:52:52 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Creating image(s) Apr 23 03:52:52 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Acquiring lock "/opt/stack/data/nova/instances/c21ec3a3-0761-4543-ad43-dca2913d159b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:52 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "/opt/stack/data/nova/instances/c21ec3a3-0761-4543-ad43-dca2913d159b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:52 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "/opt/stack/data/nova/instances/c21ec3a3-0761-4543-ad43-dca2913d159b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:52 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:52:52 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.139s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:52:52 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None 
req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Acquiring lock "935dc3474dddbde110531a31510941caddc0ae83" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:52 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:52 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:52:52 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.150s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:52:52 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/c21ec3a3-0761-4543-ad43-dca2913d159b/disk 1073741824 {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:52:52 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/c21ec3a3-0761-4543-ad43-dca2913d159b/disk 1073741824" returned: 0 in 0.064s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:52:52 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.220s 
{{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:52 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:52:52 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.136s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:52:52 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Checking if we can resize image /opt/stack/data/nova/instances/c21ec3a3-0761-4543-ad43-dca2913d159b/disk. size=1073741824 {{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 23 03:52:52 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c21ec3a3-0761-4543-ad43-dca2913d159b/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:52:52 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c21ec3a3-0761-4543-ad43-dca2913d159b/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:52:52 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Cannot resize image /opt/stack/data/nova/instances/c21ec3a3-0761-4543-ad43-dca2913d159b/disk to a smaller size. 
{{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 23 03:52:52 user nova-compute[71428]: DEBUG nova.objects.instance [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lazy-loading 'migration_context' on Instance uuid c21ec3a3-0761-4543-ad43-dca2913d159b {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:52:52 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Created local disks {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 23 03:52:52 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Ensure instance console log exists: /opt/stack/data/nova/instances/c21ec3a3-0761-4543-ad43-dca2913d159b/console.log {{(pid=71428) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 23 03:52:52 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:52 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:52 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:53 user nova-compute[71428]: DEBUG nova.network.neutron [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Successfully created port: 3cc621aa-a5f6-46de-bfd8-32dada7ca345 {{(pid=71428) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG nova.network.neutron [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Successfully updated port: 3cc621aa-a5f6-46de-bfd8-32dada7ca345 {{(pid=71428) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-064b620c-6fea-439a-a32e-c927ccd765de 
tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Acquiring lock "refresh_cache-c21ec3a3-0761-4543-ad43-dca2913d159b" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Acquired lock "refresh_cache-c21ec3a3-0761-4543-ad43-dca2913d159b" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG nova.network.neutron [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Building network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG nova.compute.manager [req-bd458a0b-827e-434b-a0ad-925ccd341ee3 req-2fca7edd-d1f3-4840-9910-5545e3aa7d46 service nova] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Received event network-changed-3cc621aa-a5f6-46de-bfd8-32dada7ca345 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG nova.compute.manager [req-bd458a0b-827e-434b-a0ad-925ccd341ee3 req-2fca7edd-d1f3-4840-9910-5545e3aa7d46 service nova] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Refreshing instance network info cache due to event network-changed-3cc621aa-a5f6-46de-bfd8-32dada7ca345. {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-bd458a0b-827e-434b-a0ad-925ccd341ee3 req-2fca7edd-d1f3-4840-9910-5545e3aa7d46 service nova] Acquiring lock "refresh_cache-c21ec3a3-0761-4543-ad43-dca2913d159b" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG nova.network.neutron [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Instance cache missing network info. 
{{(pid=71428) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71428) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG nova.network.neutron [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Updating instance_info_cache with network_info: [{"id": "3cc621aa-a5f6-46de-bfd8-32dada7ca345", "address": "fa:16:3e:e6:30:15", "network": {"id": "6adb189c-9b46-4da4-a34b-31edf8685498", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-759967402-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "24fff486a500421397ecb935828582cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cc621aa-a5", "ovs_interfaceid": "3cc621aa-a5f6-46de-bfd8-32dada7ca345", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Releasing lock "refresh_cache-c21ec3a3-0761-4543-ad43-dca2913d159b" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG nova.compute.manager [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Instance network_info: |[{"id": "3cc621aa-a5f6-46de-bfd8-32dada7ca345", "address": "fa:16:3e:e6:30:15", "network": {"id": "6adb189c-9b46-4da4-a34b-31edf8685498", "bridge": "br-int", "label": 
"tempest-AttachVolumeNegativeTest-759967402-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "24fff486a500421397ecb935828582cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cc621aa-a5", "ovs_interfaceid": "3cc621aa-a5f6-46de-bfd8-32dada7ca345", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-bd458a0b-827e-434b-a0ad-925ccd341ee3 req-2fca7edd-d1f3-4840-9910-5545e3aa7d46 service nova] Acquired lock "refresh_cache-c21ec3a3-0761-4543-ad43-dca2913d159b" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG nova.network.neutron [req-bd458a0b-827e-434b-a0ad-925ccd341ee3 req-2fca7edd-d1f3-4840-9910-5545e3aa7d46 service nova] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Refreshing network info cache for port 3cc621aa-a5f6-46de-bfd8-32dada7ca345 {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Start _get_guest_xml network_info=[{"id": "3cc621aa-a5f6-46de-bfd8-32dada7ca345", "address": "fa:16:3e:e6:30:15", "network": {"id": "6adb189c-9b46-4da4-a34b-31edf8685498", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-759967402-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "24fff486a500421397ecb935828582cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cc621aa-a5", "ovs_interfaceid": "3cc621aa-a5f6-46de-bfd8-32dada7ca345", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'image_id': 'e6127373-9931-4277-9458-eceef653ea1e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 23 03:52:54 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:52:54 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:52:54 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71428) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-23T03:43:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=), allow threads: True {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Flavor limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 
tempest-AttachVolumeNegativeTest-636753786-project-member] Image limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Flavor pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Image pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Got 1 possible topologies {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:52:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-159009824',display_name='tempest-AttachVolumeNegativeTest-server-159009824',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-159009824',id=16,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIs7ux48o2nnGrdeOCvQDySG+P6nXioDJdGS6oeGudA0JHf7ZhHzg403SyXAde6YT5+5r+rVL2ZP2ipzilgOZRPzvYzNtjeBNGfwBpGBza8LD5gUqZ8keZPqVn3njSi8NQ==',key_name='tempest-keypair-325673571',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='24fff486a500421397ecb935828582cd',ramdisk_id='',reservation_id='r-ne1tzaqb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-636753786',owner_user_name='tempest-AttachVolumeNegativeTest-636753786-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:52:52Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0a2c459cad014b07b2613e5e261d88aa',uuid=c21ec3a3-0761-4543-ad43-dca2913d159b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3cc621aa-a5f6-46de-bfd8-32dada7ca345", "address": "fa:16:3e:e6:30:15", "network": {"id": "6adb189c-9b46-4da4-a34b-31edf8685498", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-759967402-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "24fff486a500421397ecb935828582cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cc621aa-a5", "ovs_interfaceid": "3cc621aa-a5f6-46de-bfd8-32dada7ca345", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71428) get_config 
/opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Converting VIF {"id": "3cc621aa-a5f6-46de-bfd8-32dada7ca345", "address": "fa:16:3e:e6:30:15", "network": {"id": "6adb189c-9b46-4da4-a34b-31edf8685498", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-759967402-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "24fff486a500421397ecb935828582cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cc621aa-a5", "ovs_interfaceid": "3cc621aa-a5f6-46de-bfd8-32dada7ca345", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e6:30:15,bridge_name='br-int',has_traffic_filtering=True,id=3cc621aa-a5f6-46de-bfd8-32dada7ca345,network=Network(6adb189c-9b46-4da4-a34b-31edf8685498),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cc621aa-a5') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG nova.objects.instance [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lazy-loading 'pci_devices' on Instance uuid c21ec3a3-0761-4543-ad43-dca2913d159b {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] End _get_guest_xml xml= [guest domain XML omitted; surviving fragments: uuid c21ec3a3-0761-4543-ad43-dca2913d159b, name instance-00000010, memory 131072, vcpus 1, nova metadata for tempest-AttachVolumeNegativeTest-server-159009824 created 2023-04-23 03:52:54, owner tempest-AttachVolumeNegativeTest-636753786-project-member / tempest-AttachVolumeNegativeTest-636753786, sysinfo OpenStack Foundation / OpenStack Nova / 0.0.0, os type hvm, CPU model Nehalem, RNG backend /dev/urandom] {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] vif_type=ovs
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:52:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-159009824',display_name='tempest-AttachVolumeNegativeTest-server-159009824',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-159009824',id=16,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIs7ux48o2nnGrdeOCvQDySG+P6nXioDJdGS6oeGudA0JHf7ZhHzg403SyXAde6YT5+5r+rVL2ZP2ipzilgOZRPzvYzNtjeBNGfwBpGBza8LD5gUqZ8keZPqVn3njSi8NQ==',key_name='tempest-keypair-325673571',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='24fff486a500421397ecb935828582cd',ramdisk_id='',reservation_id='r-ne1tzaqb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-636753786',owner_user_name='tempest-AttachVolumeNegativeTest-636753786-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:52:52Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0a2c459cad014b07b2613e5e261d88aa',uuid=c21ec3a3-0761-4543-ad43-dca2913d159b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3cc621aa-a5f6-46de-bfd8-32dada7ca345", "address": "fa:16:3e:e6:30:15", "network": {"id": "6adb189c-9b46-4da4-a34b-31edf8685498", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-759967402-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "24fff486a500421397ecb935828582cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cc621aa-a5", "ovs_interfaceid": "3cc621aa-a5f6-46de-bfd8-32dada7ca345", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) plug 
/opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Converting VIF {"id": "3cc621aa-a5f6-46de-bfd8-32dada7ca345", "address": "fa:16:3e:e6:30:15", "network": {"id": "6adb189c-9b46-4da4-a34b-31edf8685498", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-759967402-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "24fff486a500421397ecb935828582cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cc621aa-a5", "ovs_interfaceid": "3cc621aa-a5f6-46de-bfd8-32dada7ca345", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e6:30:15,bridge_name='br-int',has_traffic_filtering=True,id=3cc621aa-a5f6-46de-bfd8-32dada7ca345,network=Network(6adb189c-9b46-4da4-a34b-31edf8685498),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cc621aa-a5') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG os_vif [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e6:30:15,bridge_name='br-int',has_traffic_filtering=True,id=3cc621aa-a5f6-46de-bfd8-32dada7ca345,network=Network(6adb189c-9b46-4da4-a34b-31edf8685498),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cc621aa-a5') {{(pid=71428) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG 
ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3cc621aa-a5, may_exist=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3cc621aa-a5, col_values=(('external_ids', {'iface-id': '3cc621aa-a5f6-46de-bfd8-32dada7ca345', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e6:30:15', 'vm-uuid': 'c21ec3a3-0761-4543-ad43-dca2913d159b'}),)) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:52:54 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:54 user nova-compute[71428]: INFO os_vif [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e6:30:15,bridge_name='br-int',has_traffic_filtering=True,id=3cc621aa-a5f6-46de-bfd8-32dada7ca345,network=Network(6adb189c-9b46-4da4-a34b-31edf8685498),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cc621aa-a5') Apr 23 03:52:55 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] No BDM found with device name vda, not building metadata. {{(pid=71428) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 23 03:52:55 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] No VIF found with MAC fa:16:3e:e6:30:15, not building metadata {{(pid=71428) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 23 03:52:55 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:55 user nova-compute[71428]: DEBUG nova.network.neutron [req-bd458a0b-827e-434b-a0ad-925ccd341ee3 req-2fca7edd-d1f3-4840-9910-5545e3aa7d46 service nova] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Updated VIF entry in instance network info cache for port 3cc621aa-a5f6-46de-bfd8-32dada7ca345. 
{{(pid=71428) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 23 03:52:55 user nova-compute[71428]: DEBUG nova.network.neutron [req-bd458a0b-827e-434b-a0ad-925ccd341ee3 req-2fca7edd-d1f3-4840-9910-5545e3aa7d46 service nova] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Updating instance_info_cache with network_info: [{"id": "3cc621aa-a5f6-46de-bfd8-32dada7ca345", "address": "fa:16:3e:e6:30:15", "network": {"id": "6adb189c-9b46-4da4-a34b-31edf8685498", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-759967402-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "24fff486a500421397ecb935828582cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cc621aa-a5", "ovs_interfaceid": "3cc621aa-a5f6-46de-bfd8-32dada7ca345", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:52:55 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-bd458a0b-827e-434b-a0ad-925ccd341ee3 req-2fca7edd-d1f3-4840-9910-5545e3aa7d46 service nova] Releasing lock "refresh_cache-c21ec3a3-0761-4543-ad43-dca2913d159b" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:52:55 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:56 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:56 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:56 user nova-compute[71428]: DEBUG nova.compute.manager [req-f2c383ec-709e-4127-80c5-47a2f6ff7f03 req-20da1690-a8e9-413d-a7d1-ce452aad81d3 service nova] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Received event network-vif-plugged-3cc621aa-a5f6-46de-bfd8-32dada7ca345 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:52:56 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-f2c383ec-709e-4127-80c5-47a2f6ff7f03 req-20da1690-a8e9-413d-a7d1-ce452aad81d3 service nova] Acquiring lock "c21ec3a3-0761-4543-ad43-dca2913d159b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:56 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-f2c383ec-709e-4127-80c5-47a2f6ff7f03 req-20da1690-a8e9-413d-a7d1-ce452aad81d3 service nova] Lock "c21ec3a3-0761-4543-ad43-dca2913d159b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:56 user 
nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-f2c383ec-709e-4127-80c5-47a2f6ff7f03 req-20da1690-a8e9-413d-a7d1-ce452aad81d3 service nova] Lock "c21ec3a3-0761-4543-ad43-dca2913d159b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:56 user nova-compute[71428]: DEBUG nova.compute.manager [req-f2c383ec-709e-4127-80c5-47a2f6ff7f03 req-20da1690-a8e9-413d-a7d1-ce452aad81d3 service nova] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] No waiting events found dispatching network-vif-plugged-3cc621aa-a5f6-46de-bfd8-32dada7ca345 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:52:56 user nova-compute[71428]: WARNING nova.compute.manager [req-f2c383ec-709e-4127-80c5-47a2f6ff7f03 req-20da1690-a8e9-413d-a7d1-ce452aad81d3 service nova] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Received unexpected event network-vif-plugged-3cc621aa-a5f6-46de-bfd8-32dada7ca345 for instance with vm_state building and task_state spawning. Apr 23 03:52:56 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:56 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:56 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:56 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:58 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Resumed> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:52:58 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] VM Resumed (Lifecycle Event) Apr 23 03:52:58 user nova-compute[71428]: DEBUG nova.compute.manager [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Instance event wait completed in 0 seconds for {{(pid=71428) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 23 03:52:58 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Guest created on hypervisor {{(pid=71428) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 23 03:52:58 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Instance spawned successfully. 
Apr 23 03:52:58 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 23 03:52:58 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Found default for hw_cdrom_bus of ide {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:52:58 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Found default for hw_disk_bus of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:52:58 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Found default for hw_input_bus of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:52:58 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Found default for hw_pointer_model of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:52:58 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Found default for hw_video_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:52:58 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Found default for hw_vif_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:52:58 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:52:58 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: 
building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 03:52:58 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] During sync_power_state the instance has a pending task (spawning). Skip. Apr 23 03:52:58 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Started> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:52:58 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] VM Started (Lifecycle Event) Apr 23 03:52:58 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:52:58 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 03:52:58 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 23 03:52:58 user nova-compute[71428]: DEBUG nova.compute.manager [req-bd3bb624-238d-4b76-9b2e-f987cef352d2 req-c997de20-3c8f-4996-9d8b-3b2cac5e391a service nova] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Received event network-vif-plugged-3cc621aa-a5f6-46de-bfd8-32dada7ca345 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:52:58 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-bd3bb624-238d-4b76-9b2e-f987cef352d2 req-c997de20-3c8f-4996-9d8b-3b2cac5e391a service nova] Acquiring lock "c21ec3a3-0761-4543-ad43-dca2913d159b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:58 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-bd3bb624-238d-4b76-9b2e-f987cef352d2 req-c997de20-3c8f-4996-9d8b-3b2cac5e391a service nova] Lock "c21ec3a3-0761-4543-ad43-dca2913d159b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:58 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-bd3bb624-238d-4b76-9b2e-f987cef352d2 req-c997de20-3c8f-4996-9d8b-3b2cac5e391a service nova] Lock "c21ec3a3-0761-4543-ad43-dca2913d159b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:58 user nova-compute[71428]: DEBUG nova.compute.manager [req-bd3bb624-238d-4b76-9b2e-f987cef352d2 req-c997de20-3c8f-4996-9d8b-3b2cac5e391a service nova] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] No waiting events found dispatching network-vif-plugged-3cc621aa-a5f6-46de-bfd8-32dada7ca345 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:52:58 user nova-compute[71428]: WARNING nova.compute.manager [req-bd3bb624-238d-4b76-9b2e-f987cef352d2 req-c997de20-3c8f-4996-9d8b-3b2cac5e391a service nova] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Received unexpected event network-vif-plugged-3cc621aa-a5f6-46de-bfd8-32dada7ca345 for instance with vm_state building and task_state spawning. Apr 23 03:52:58 user nova-compute[71428]: INFO nova.compute.manager [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Took 6.42 seconds to spawn the instance on the hypervisor. Apr 23 03:52:58 user nova-compute[71428]: DEBUG nova.compute.manager [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:52:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:58 user nova-compute[71428]: INFO nova.compute.manager [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Took 7.12 seconds to build instance. 
Apr 23 03:52:58 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-064b620c-6fea-439a-a32e-c927ccd765de tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "c21ec3a3-0761-4543-ad43-dca2913d159b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.225s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:59 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Acquiring lock "97b28eb9-634c-4f9b-97ef-41d9f45f19b7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:59 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "97b28eb9-634c-4f9b-97ef-41d9f45f19b7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:59 user nova-compute[71428]: DEBUG nova.compute.manager [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Starting instance... {{(pid=71428) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 23 03:52:59 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:59 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:59 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71428) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 23 03:52:59 user nova-compute[71428]: INFO nova.compute.claims [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Claim successful on node user Apr 23 03:52:59 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:52:59 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:52:59 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.341s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:59 user nova-compute[71428]: DEBUG nova.compute.manager [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Start building networks asynchronously for instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 23 03:52:59 user nova-compute[71428]: DEBUG nova.compute.manager [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Allocating IP information in the background. 
{{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 23 03:52:59 user nova-compute[71428]: DEBUG nova.network.neutron [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] allocate_for_instance() {{(pid=71428) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 23 03:52:59 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 23 03:52:59 user nova-compute[71428]: DEBUG nova.compute.manager [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Start building block device mappings for instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 23 03:52:59 user nova-compute[71428]: DEBUG nova.compute.manager [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Start spawning the instance on the hypervisor. {{(pid=71428) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 23 03:52:59 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Creating instance directory {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 23 03:52:59 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Creating image(s) Apr 23 03:52:59 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Acquiring lock "/opt/stack/data/nova/instances/97b28eb9-634c-4f9b-97ef-41d9f45f19b7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:59 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "/opt/stack/data/nova/instances/97b28eb9-634c-4f9b-97ef-41d9f45f19b7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:59 user nova-compute[71428]: DEBUG 
oslo_concurrency.lockutils [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "/opt/stack/data/nova/instances/97b28eb9-634c-4f9b-97ef-41d9f45f19b7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:52:59 user nova-compute[71428]: DEBUG nova.policy [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '99495726467944c38620831fc93e2856', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a79e09b1c4ae4cc5ab11c3e56ee4f0d9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71428) authorize /opt/stack/nova/nova/policy.py:203}} Apr 23 03:52:59 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:52:59 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:52:59 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.150s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:52:59 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Acquiring lock "935dc3474dddbde110531a31510941caddc0ae83" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:52:59 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.003s {{(pid=71428) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:52:59 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:53:00 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.188s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:53:00 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/97b28eb9-634c-4f9b-97ef-41d9f45f19b7/disk 1073741824 {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:53:00 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/97b28eb9-634c-4f9b-97ef-41d9f45f19b7/disk 1073741824" returned: 0 in 0.057s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:53:00 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.254s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:53:00 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:53:00 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.151s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:53:00 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Checking if we can resize image /opt/stack/data/nova/instances/97b28eb9-634c-4f9b-97ef-41d9f45f19b7/disk. size=1073741824 {{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 23 03:53:00 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97b28eb9-634c-4f9b-97ef-41d9f45f19b7/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:53:00 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97b28eb9-634c-4f9b-97ef-41d9f45f19b7/disk --force-share --output=json" returned: 0 in 0.145s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:53:00 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Cannot resize image /opt/stack/data/nova/instances/97b28eb9-634c-4f9b-97ef-41d9f45f19b7/disk to a smaller size. 
{{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 23 03:53:00 user nova-compute[71428]: DEBUG nova.objects.instance [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lazy-loading 'migration_context' on Instance uuid 97b28eb9-634c-4f9b-97ef-41d9f45f19b7 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:53:00 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Created local disks {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 23 03:53:00 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Ensure instance console log exists: /opt/stack/data/nova/instances/97b28eb9-634c-4f9b-97ef-41d9f45f19b7/console.log {{(pid=71428) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 23 03:53:00 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:53:00 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:53:00 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:53:00 user nova-compute[71428]: DEBUG nova.network.neutron [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Successfully created port: 6d65b2b7-2a9a-4e2b-b003-af767e8a544f {{(pid=71428) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG nova.network.neutron [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Successfully updated port: 6d65b2b7-2a9a-4e2b-b003-af767e8a544f {{(pid=71428) _update_port 
/opt/stack/nova/nova/network/neutron.py:584}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Acquiring lock "refresh_cache-97b28eb9-634c-4f9b-97ef-41d9f45f19b7" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Acquired lock "refresh_cache-97b28eb9-634c-4f9b-97ef-41d9f45f19b7" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG nova.network.neutron [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Building network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG nova.compute.manager [req-a6c8e6e5-0601-45e8-a081-1719293612f0 req-14b1316a-96ee-4fb8-81f7-ca54f71a4f70 service nova] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Received event network-changed-6d65b2b7-2a9a-4e2b-b003-af767e8a544f {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG nova.compute.manager [req-a6c8e6e5-0601-45e8-a081-1719293612f0 req-14b1316a-96ee-4fb8-81f7-ca54f71a4f70 service nova] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Refreshing instance network info cache due to event network-changed-6d65b2b7-2a9a-4e2b-b003-af767e8a544f. {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-a6c8e6e5-0601-45e8-a081-1719293612f0 req-14b1316a-96ee-4fb8-81f7-ca54f71a4f70 service nova] Acquiring lock "refresh_cache-97b28eb9-634c-4f9b-97ef-41d9f45f19b7" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG nova.network.neutron [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Instance cache missing network info. 
{{(pid=71428) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG nova.network.neutron [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Updating instance_info_cache with network_info: [{"id": "6d65b2b7-2a9a-4e2b-b003-af767e8a544f", "address": "fa:16:3e:a0:05:16", "network": {"id": "25de9e6c-4c02-4658-b076-466681d96605", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1644366708-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a79e09b1c4ae4cc5ab11c3e56ee4f0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d65b2b7-2a", "ovs_interfaceid": "6d65b2b7-2a9a-4e2b-b003-af767e8a544f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Releasing lock "refresh_cache-97b28eb9-634c-4f9b-97ef-41d9f45f19b7" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG nova.compute.manager [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Instance network_info: |[{"id": "6d65b2b7-2a9a-4e2b-b003-af767e8a544f", "address": "fa:16:3e:a0:05:16", "network": {"id": "25de9e6c-4c02-4658-b076-466681d96605", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1644366708-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a79e09b1c4ae4cc5ab11c3e56ee4f0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d65b2b7-2a", "ovs_interfaceid": "6d65b2b7-2a9a-4e2b-b003-af767e8a544f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-a6c8e6e5-0601-45e8-a081-1719293612f0 req-14b1316a-96ee-4fb8-81f7-ca54f71a4f70 service nova] Acquired lock 
"refresh_cache-97b28eb9-634c-4f9b-97ef-41d9f45f19b7" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG nova.network.neutron [req-a6c8e6e5-0601-45e8-a081-1719293612f0 req-14b1316a-96ee-4fb8-81f7-ca54f71a4f70 service nova] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Refreshing network info cache for port 6d65b2b7-2a9a-4e2b-b003-af767e8a544f {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Start _get_guest_xml network_info=[{"id": "6d65b2b7-2a9a-4e2b-b003-af767e8a544f", "address": "fa:16:3e:a0:05:16", "network": {"id": "25de9e6c-4c02-4658-b076-466681d96605", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1644366708-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a79e09b1c4ae4cc5ab11c3e56ee4f0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d65b2b7-2a", "ovs_interfaceid": "6d65b2b7-2a9a-4e2b-b003-af767e8a544f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'image_id': 'e6127373-9931-4277-9458-eceef653ea1e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 23 03:53:01 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 23 03:53:01 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:53:01 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71428) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-23T03:43:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=), allow threads: True {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Flavor limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Image limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Flavor pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Image pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Chose sockets=0, cores=0, threads=0; limits were 
sockets=65536, cores=65536, threads=65536 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Got 1 possible topologies {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:52:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1016623574',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1016623574',id=17,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a79e09b1c4ae4cc5ab11c3e56ee4f0d9',ramdisk_id='',reservation_id='r-y1kg0tql',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-653032906',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:53:00Z,user_data=None,user_id='99495726467944c38620831fc93e2856',uuid=97b28eb9-634c-4f9b-97ef-41d9f45f19b7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6d65b2b7-2a9a-4e2b-b003-af767e8a544f", "address": "fa:16:3e:a0:05:16", "network": {"id": "25de9e6c-4c02-4658-b076-466681d96605", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1644366708-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a79e09b1c4ae4cc5ab11c3e56ee4f0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d65b2b7-2a", "ovs_interfaceid": "6d65b2b7-2a9a-4e2b-b003-af767e8a544f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71428) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Converting VIF {"id": "6d65b2b7-2a9a-4e2b-b003-af767e8a544f", "address": 
"fa:16:3e:a0:05:16", "network": {"id": "25de9e6c-4c02-4658-b076-466681d96605", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1644366708-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a79e09b1c4ae4cc5ab11c3e56ee4f0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d65b2b7-2a", "ovs_interfaceid": "6d65b2b7-2a9a-4e2b-b003-af767e8a544f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a0:05:16,bridge_name='br-int',has_traffic_filtering=True,id=6d65b2b7-2a9a-4e2b-b003-af767e8a544f,network=Network(25de9e6c-4c02-4658-b076-466681d96605),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d65b2b7-2a') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG nova.objects.instance [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lazy-loading 'pci_devices' on Instance uuid 97b28eb9-634c-4f9b-97ef-41d9f45f19b7 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] End _get_guest_xml xml= Apr 23 03:53:01 user nova-compute[71428]: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7 Apr 23 03:53:01 user nova-compute[71428]: instance-00000011 Apr 23 03:53:01 user nova-compute[71428]: 131072 Apr 23 03:53:01 user nova-compute[71428]: 1 Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: tempest-ServerBootFromVolumeStableRescueTest-server-1016623574 Apr 23 03:53:01 user nova-compute[71428]: 2023-04-23 03:53:01 Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: 128 Apr 23 03:53:01 user nova-compute[71428]: 1 Apr 23 03:53:01 user nova-compute[71428]: 0 Apr 23 03:53:01 user nova-compute[71428]: 0 Apr 23 03:53:01 user nova-compute[71428]: 1 Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member Apr 23 03:53:01 user nova-compute[71428]: tempest-ServerBootFromVolumeStableRescueTest-653032906 Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user 
nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: OpenStack Foundation Apr 23 03:53:01 user nova-compute[71428]: OpenStack Nova Apr 23 03:53:01 user nova-compute[71428]: 0.0.0 Apr 23 03:53:01 user nova-compute[71428]: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7 Apr 23 03:53:01 user nova-compute[71428]: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7 Apr 23 03:53:01 user nova-compute[71428]: Virtual Machine Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: hvm Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Nehalem Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: /dev/urandom Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: Apr 23 03:53:01 user nova-compute[71428]: {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:52:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1016623574',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1016623574',id=17,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a79e09b1c4ae4cc5ab11c3e56ee4f0d9',ramdisk_id='',reservation_id='r-y1kg0tql',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-653032906',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:53:00Z,user_data=None,user_id='99495726467944c38620831fc93e2856',uuid=97b28eb9-634c-4f9b-97ef-41d9f45f19b7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6d65b2b7-2a9a-4e2b-b003-af767e8a544f", "address": "fa:16:3e:a0:05:16", "network": {"id": "25de9e6c-4c02-4658-b076-466681d96605", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1644366708-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a79e09b1c4ae4cc5ab11c3e56ee4f0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d65b2b7-2a", "ovs_interfaceid": "6d65b2b7-2a9a-4e2b-b003-af767e8a544f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Converting VIF {"id": "6d65b2b7-2a9a-4e2b-b003-af767e8a544f", "address": 
"fa:16:3e:a0:05:16", "network": {"id": "25de9e6c-4c02-4658-b076-466681d96605", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1644366708-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a79e09b1c4ae4cc5ab11c3e56ee4f0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d65b2b7-2a", "ovs_interfaceid": "6d65b2b7-2a9a-4e2b-b003-af767e8a544f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a0:05:16,bridge_name='br-int',has_traffic_filtering=True,id=6d65b2b7-2a9a-4e2b-b003-af767e8a544f,network=Network(25de9e6c-4c02-4658-b076-466681d96605),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d65b2b7-2a') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG os_vif [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:05:16,bridge_name='br-int',has_traffic_filtering=True,id=6d65b2b7-2a9a-4e2b-b003-af767e8a544f,network=Network(25de9e6c-4c02-4658-b076-466681d96605),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d65b2b7-2a') {{(pid=71428) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6d65b2b7-2a, may_exist=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:53:01 user 
nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6d65b2b7-2a, col_values=(('external_ids', {'iface-id': '6d65b2b7-2a9a-4e2b-b003-af767e8a544f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a0:05:16', 'vm-uuid': '97b28eb9-634c-4f9b-97ef-41d9f45f19b7'}),)) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:53:01 user nova-compute[71428]: INFO os_vif [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:05:16,bridge_name='br-int',has_traffic_filtering=True,id=6d65b2b7-2a9a-4e2b-b003-af767e8a544f,network=Network(25de9e6c-4c02-4658-b076-466681d96605),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d65b2b7-2a') Apr 23 03:53:01 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] No BDM found with device name vda, not building metadata. {{(pid=71428) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 23 03:53:01 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] No VIF found with MAC fa:16:3e:a0:05:16, not building metadata {{(pid=71428) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 23 03:53:02 user nova-compute[71428]: DEBUG nova.network.neutron [req-a6c8e6e5-0601-45e8-a081-1719293612f0 req-14b1316a-96ee-4fb8-81f7-ca54f71a4f70 service nova] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Updated VIF entry in instance network info cache for port 6d65b2b7-2a9a-4e2b-b003-af767e8a544f. 
{{(pid=71428) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 23 03:53:02 user nova-compute[71428]: DEBUG nova.network.neutron [req-a6c8e6e5-0601-45e8-a081-1719293612f0 req-14b1316a-96ee-4fb8-81f7-ca54f71a4f70 service nova] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Updating instance_info_cache with network_info: [{"id": "6d65b2b7-2a9a-4e2b-b003-af767e8a544f", "address": "fa:16:3e:a0:05:16", "network": {"id": "25de9e6c-4c02-4658-b076-466681d96605", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1644366708-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a79e09b1c4ae4cc5ab11c3e56ee4f0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d65b2b7-2a", "ovs_interfaceid": "6d65b2b7-2a9a-4e2b-b003-af767e8a544f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:53:02 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-a6c8e6e5-0601-45e8-a081-1719293612f0 req-14b1316a-96ee-4fb8-81f7-ca54f71a4f70 service nova] Releasing lock "refresh_cache-97b28eb9-634c-4f9b-97ef-41d9f45f19b7" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:53:03 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:53:03 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:53:03 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:53:03 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:53:03 user nova-compute[71428]: DEBUG nova.compute.manager [req-34dae454-273e-4295-9348-2e966d4d130b req-a15e374a-ce9d-4433-918f-80d3a237a14a service nova] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Received event network-vif-plugged-6d65b2b7-2a9a-4e2b-b003-af767e8a544f {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:53:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-34dae454-273e-4295-9348-2e966d4d130b req-a15e374a-ce9d-4433-918f-80d3a237a14a service nova] Acquiring lock "97b28eb9-634c-4f9b-97ef-41d9f45f19b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:53:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-34dae454-273e-4295-9348-2e966d4d130b req-a15e374a-ce9d-4433-918f-80d3a237a14a service nova] Lock "97b28eb9-634c-4f9b-97ef-41d9f45f19b7-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:53:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-34dae454-273e-4295-9348-2e966d4d130b req-a15e374a-ce9d-4433-918f-80d3a237a14a service nova] Lock "97b28eb9-634c-4f9b-97ef-41d9f45f19b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:53:03 user nova-compute[71428]: DEBUG nova.compute.manager [req-34dae454-273e-4295-9348-2e966d4d130b req-a15e374a-ce9d-4433-918f-80d3a237a14a service nova] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] No waiting events found dispatching network-vif-plugged-6d65b2b7-2a9a-4e2b-b003-af767e8a544f {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:53:03 user nova-compute[71428]: WARNING nova.compute.manager [req-34dae454-273e-4295-9348-2e966d4d130b req-a15e374a-ce9d-4433-918f-80d3a237a14a service nova] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Received unexpected event network-vif-plugged-6d65b2b7-2a9a-4e2b-b003-af767e8a544f for instance with vm_state building and task_state spawning. Apr 23 03:53:03 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:53:05 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Resumed> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:53:05 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] VM Resumed (Lifecycle Event) Apr 23 03:53:05 user nova-compute[71428]: DEBUG nova.compute.manager [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Instance event wait completed in 0 seconds for {{(pid=71428) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 23 03:53:05 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Guest created on hypervisor {{(pid=71428) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 23 03:53:05 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Instance spawned successfully. 
Apr 23 03:53:05 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 23 03:53:05 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:53:05 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 03:53:05 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Found default for hw_cdrom_bus of ide {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:53:05 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Found default for hw_disk_bus of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:53:05 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Found default for hw_input_bus of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:53:05 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Found default for hw_pointer_model of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:53:05 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Found default for hw_video_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:53:05 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None 
req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Found default for hw_vif_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:53:05 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] During sync_power_state the instance has a pending task (spawning). Skip. Apr 23 03:53:05 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Started> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:53:05 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] VM Started (Lifecycle Event) Apr 23 03:53:05 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:53:05 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 03:53:05 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] During sync_power_state the instance has a pending task (spawning). Skip. Apr 23 03:53:05 user nova-compute[71428]: INFO nova.compute.manager [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Took 5.77 seconds to spawn the instance on the hypervisor. 
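The paired "Synchronizing instance power state ..." / "During sync_power_state the instance has a pending task (spawning). Skip." lines reflect a simple guard: lifecycle events that arrive while the instance still has a task_state are not acted on. A rough, simplified illustration of that decision (not Nova's implementation), using the numeric power-state values shown in the log (DB power_state 0, VM power_state 1):

    # Simplified illustration of the sync_power_state guard logged above.
    # The constants match the numeric values in the log; the function body is
    # an assumption, not Nova's code.
    NOSTATE = 0   # DB power_state in the log
    RUNNING = 1   # VM power_state reported by the hypervisor

    def sync_power_state(db_power_state, vm_power_state, task_state):
        if task_state is not None:
            # e.g. 'spawning' -> "has a pending task (spawning). Skip."
            return 'skip'
        if db_power_state != vm_power_state:
            return 'update-db'   # adopt the hypervisor's view
        return 'in-sync'

    print(sync_power_state(NOSTATE, RUNNING, 'spawning'))  # -> skip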
Apr 23 03:53:05 user nova-compute[71428]: DEBUG nova.compute.manager [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:53:05 user nova-compute[71428]: DEBUG nova.compute.manager [req-17390351-652e-4d19-958f-d731b3836ac7 req-d3baacb9-9d23-4cf3-9033-6ca5ea7834df service nova] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Received event network-vif-plugged-6d65b2b7-2a9a-4e2b-b003-af767e8a544f {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:53:05 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-17390351-652e-4d19-958f-d731b3836ac7 req-d3baacb9-9d23-4cf3-9033-6ca5ea7834df service nova] Acquiring lock "97b28eb9-634c-4f9b-97ef-41d9f45f19b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:53:05 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-17390351-652e-4d19-958f-d731b3836ac7 req-d3baacb9-9d23-4cf3-9033-6ca5ea7834df service nova] Lock "97b28eb9-634c-4f9b-97ef-41d9f45f19b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.002s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:53:05 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-17390351-652e-4d19-958f-d731b3836ac7 req-d3baacb9-9d23-4cf3-9033-6ca5ea7834df service nova] Lock "97b28eb9-634c-4f9b-97ef-41d9f45f19b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:53:05 user nova-compute[71428]: DEBUG nova.compute.manager [req-17390351-652e-4d19-958f-d731b3836ac7 req-d3baacb9-9d23-4cf3-9033-6ca5ea7834df service nova] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] No waiting events found dispatching network-vif-plugged-6d65b2b7-2a9a-4e2b-b003-af767e8a544f {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:53:05 user nova-compute[71428]: WARNING nova.compute.manager [req-17390351-652e-4d19-958f-d731b3836ac7 req-d3baacb9-9d23-4cf3-9033-6ca5ea7834df service nova] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Received unexpected event network-vif-plugged-6d65b2b7-2a9a-4e2b-b003-af767e8a544f for instance with vm_state building and task_state spawning. Apr 23 03:53:05 user nova-compute[71428]: INFO nova.compute.manager [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Took 6.45 seconds to build instance. 
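The Acquiring/acquired/released lines around "97b28eb9-...-events" are oslo.concurrency named locks; Nova serializes access to an instance's pending external events behind a lock derived from the instance UUID. A minimal sketch of the same primitive, with a hypothetical critical section:

    # Minimal sketch of the oslo.concurrency named-lock pattern behind the
    # "...-events" acquire/release lines. The lock name comes from the log;
    # the critical section is a hypothetical stand-in.
    from oslo_concurrency import lockutils

    instance_uuid = '97b28eb9-634c-4f9b-97ef-41d9f45f19b7'

    with lockutils.lock('%s-events' % instance_uuid):
        # pop or record a pending network-vif-plugged event here
        pass

    # The same library also provides a decorator form, as used around the
    # "compute_resources" lock later in this log:
    @lockutils.synchronized('compute_resources')
    def update_available_resource():
        pass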
Apr 23 03:53:05 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-ae9ce82a-0fab-4494-a89b-9156bb332704 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "97b28eb9-634c-4f9b-97ef-41d9f45f19b7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.541s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:53:06 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:53:07 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:53:08 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:53:09 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:53:09 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:53:11 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:53:11 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:53:13 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:53:13 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:53:13 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71428) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 23 03:53:13 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:53:15 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:53:15 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:53:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:53:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:53:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:53:15 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Auditing locally available compute resources for user (node: user) {{(pid=71428) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 23 03:53:15 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97b28eb9-634c-4f9b-97ef-41d9f45f19b7/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:53:16 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97b28eb9-634c-4f9b-97ef-41d9f45f19b7/disk --force-share --output=json" returned: 0 in 0.160s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:53:16 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97b28eb9-634c-4f9b-97ef-41d9f45f19b7/disk --force-share --output=json 
{{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:53:16 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97b28eb9-634c-4f9b-97ef-41d9f45f19b7/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:53:16 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/39775e82-0123-41cf-ad97-d3865a796828/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:53:16 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/39775e82-0123-41cf-ad97-d3865a796828/disk --force-share --output=json" returned: 0 in 0.145s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:53:16 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/39775e82-0123-41cf-ad97-d3865a796828/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:53:16 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/39775e82-0123-41cf-ad97-d3865a796828/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:53:16 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:53:16 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk --force-share --output=json" returned: 0 in 0.155s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:53:16 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 
None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:53:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:53:16 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk --force-share --output=json" returned: 0 in 0.147s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:53:16 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/954f41b2-5c67-41a4-b38b-ebf3ad60cac7/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:53:17 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/954f41b2-5c67-41a4-b38b-ebf3ad60cac7/disk --force-share --output=json" returned: 0 in 0.155s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:53:17 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/954f41b2-5c67-41a4-b38b-ebf3ad60cac7/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:53:17 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/954f41b2-5c67-41a4-b38b-ebf3ad60cac7/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:53:17 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c21ec3a3-0761-4543-ad43-dca2913d159b/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:53:17 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m 
oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c21ec3a3-0761-4543-ad43-dca2913d159b/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:53:17 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c21ec3a3-0761-4543-ad43-dca2913d159b/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:53:17 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c21ec3a3-0761-4543-ad43-dca2913d159b/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:53:18 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:53:18 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:53:18 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
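Each of the qemu-img runs in the resource audit above is wrapped by oslo.concurrency's prlimit helper, which re-executes the command under an address-space cap (--as=1073741824) and a CPU-time cap (--cpu=30). A minimal sketch of issuing the same call directly, using one of the disk paths from the log:

    # Minimal sketch of the prlimit-wrapped "qemu-img info" call from the
    # periodic resource audit above. Limits mirror --as=1073741824 --cpu=30.
    import json
    from oslo_concurrency import processutils

    QEMU_IMG_LIMITS = processutils.ProcessLimits(
        cpu_time=30,                         # seconds
        address_space=1024 * 1024 * 1024)    # 1 GiB

    out, _err = processutils.execute(
        'env', 'LC_ALL=C', 'LANG=C',
        'qemu-img', 'info',
        '/opt/stack/data/nova/instances/97b28eb9-634c-4f9b-97ef-41d9f45f19b7/disk',
        '--force-share', '--output=json',
        prlimit=QEMU_IMG_LIMITS)

    info = json.loads(out)
    print(info['format'], info['virtual-size'])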
Apr 23 03:53:18 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Hypervisor/Node resource view: name=user free_ram=8463MB free_disk=26.226459503173828GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71428) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 23 03:53:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:53:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:53:18 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 954f41b2-5c67-41a4-b38b-ebf3ad60cac7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:53:18 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance c8480a7c-b5de-4f66-a4f0-08fc679b0dfd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:53:18 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 39775e82-0123-41cf-ad97-d3865a796828 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:53:18 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance c21ec3a3-0761-4543-ad43-dca2913d159b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:53:18 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 97b28eb9-634c-4f9b-97ef-41d9f45f19b7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:53:18 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Total usable vcpus: 12, total allocated vcpus: 5 {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 23 03:53:18 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Final resource view: name=user phys_ram=16023MB used_ram=1152MB phys_disk=40GB used_disk=5GB total_vcpus=12 used_vcpus=5 pci_stats=[] {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 23 03:53:18 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:53:18 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:53:18 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Compute_service record updated for user:user {{(pid=71428) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 23 03:53:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.376s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:53:18 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:53:19 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:53:19 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71428) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:53:19 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:53:19 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Starting heal instance info cache {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 23 03:53:19 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Didn't find any instances for network info cache update. {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 23 03:53:20 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:53:21 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:53:23 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:53:25 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:53:26 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:53:27 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:53:28 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:53:31 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:53:33 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:53:36 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:53:36 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:53:38 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:53:39 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:53:41 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:53:46 user 
nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:53:46 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:53:46 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71428) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 23 03:53:46 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 03:53:46 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 03:53:46 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:53:48 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:53:51 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:53:53 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:53:56 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:54:01 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:54:01 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:54:01 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71428) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 23 03:54:01 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 03:54:01 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 03:54:01 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:54:06 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:54:08 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-5e1dca6a-52fe-4724-a03c-b7cfd9f25e9c tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Acquiring lock 
"39775e82-0123-41cf-ad97-d3865a796828" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:54:08 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-5e1dca6a-52fe-4724-a03c-b7cfd9f25e9c tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Lock "39775e82-0123-41cf-ad97-d3865a796828" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:54:08 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-5e1dca6a-52fe-4724-a03c-b7cfd9f25e9c tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Acquiring lock "39775e82-0123-41cf-ad97-d3865a796828-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:54:08 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-5e1dca6a-52fe-4724-a03c-b7cfd9f25e9c tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Lock "39775e82-0123-41cf-ad97-d3865a796828-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:54:08 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-5e1dca6a-52fe-4724-a03c-b7cfd9f25e9c tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Lock "39775e82-0123-41cf-ad97-d3865a796828-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:54:08 user nova-compute[71428]: INFO nova.compute.manager [None req-5e1dca6a-52fe-4724-a03c-b7cfd9f25e9c tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Terminating instance Apr 23 03:54:08 user nova-compute[71428]: DEBUG nova.compute.manager [None req-5e1dca6a-52fe-4724-a03c-b7cfd9f25e9c tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Start destroying the instance on the hypervisor. 
{{(pid=71428) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 23 03:54:08 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:54:08 user nova-compute[71428]: DEBUG nova.compute.manager [req-8ac0c576-a104-4a6c-8df1-bb72756cae49 req-54a0ae2a-f976-41de-bd65-fc93980d7280 service nova] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Received event network-vif-unplugged-83fd1d96-1b09-4965-9a3b-a362e9ffc06e {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:54:08 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-8ac0c576-a104-4a6c-8df1-bb72756cae49 req-54a0ae2a-f976-41de-bd65-fc93980d7280 service nova] Acquiring lock "39775e82-0123-41cf-ad97-d3865a796828-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:54:08 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-8ac0c576-a104-4a6c-8df1-bb72756cae49 req-54a0ae2a-f976-41de-bd65-fc93980d7280 service nova] Lock "39775e82-0123-41cf-ad97-d3865a796828-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:54:08 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-8ac0c576-a104-4a6c-8df1-bb72756cae49 req-54a0ae2a-f976-41de-bd65-fc93980d7280 service nova] Lock "39775e82-0123-41cf-ad97-d3865a796828-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:54:08 user nova-compute[71428]: DEBUG nova.compute.manager [req-8ac0c576-a104-4a6c-8df1-bb72756cae49 req-54a0ae2a-f976-41de-bd65-fc93980d7280 service nova] [instance: 39775e82-0123-41cf-ad97-d3865a796828] No waiting events found dispatching network-vif-unplugged-83fd1d96-1b09-4965-9a3b-a362e9ffc06e {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:54:08 user nova-compute[71428]: DEBUG nova.compute.manager [req-8ac0c576-a104-4a6c-8df1-bb72756cae49 req-54a0ae2a-f976-41de-bd65-fc93980d7280 service nova] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Received event network-vif-unplugged-83fd1d96-1b09-4965-9a3b-a362e9ffc06e for instance with task_state deleting. {{(pid=71428) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 23 03:54:08 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Instance destroyed successfully. 
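"Instance destroyed successfully." is emitted by the libvirt driver once the guest is gone. A rough sketch of the underlying libvirt-python calls, with simplified assumptions only (Nova's real path also detaches volumes, passes undefine flags, and retries on failure):

    # Rough sketch of the libvirt calls behind "Instance destroyed successfully."
    # Not Nova's implementation; error handling and undefine flags are omitted.
    import libvirt

    conn = libvirt.open('qemu:///system')
    dom = conn.lookupByUUIDString('39775e82-0123-41cf-ad97-d3865a796828')

    if dom.isActive():
        dom.destroy()   # hard power-off of the running guest
    dom.undefine()      # drop the persistent domain definition
    conn.close()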
Apr 23 03:54:08 user nova-compute[71428]: DEBUG nova.objects.instance [None req-5e1dca6a-52fe-4724-a03c-b7cfd9f25e9c tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Lazy-loading 'resources' on Instance uuid 39775e82-0123-41cf-ad97-d3865a796828 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:54:08 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-5e1dca6a-52fe-4724-a03c-b7cfd9f25e9c tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:52:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-1320327354',display_name='tempest-VolumesActionsTest-instance-1320327354',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesactionstest-instance-1320327354',id=15,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-23T03:52:23Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='037d3963711a4727a4def98cfff4da67',ramdisk_id='',reservation_id='r-d5cw2pd0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-VolumesActionsTest-911298328',owner_user_name='tempest-VolumesActionsTest-911298328-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-23T03:52:23Z,user_data=None,user_id='405152a49bf14ad68f5c96f767c70b3a',uuid=39775e82-0123-41cf-ad97-d3865a796828,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "83fd1d96-1b09-4965-9a3b-a362e9ffc06e", "address": "fa:16:3e:6a:9d:c3", "network": {"id": "b6746973-374e-4be6-9067-1f3d7424fe1d", "bridge": "br-int", "label": "tempest-VolumesActionsTest-376048108-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "037d3963711a4727a4def98cfff4da67", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap83fd1d96-1b", "ovs_interfaceid": 
"83fd1d96-1b09-4965-9a3b-a362e9ffc06e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 23 03:54:08 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-5e1dca6a-52fe-4724-a03c-b7cfd9f25e9c tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Converting VIF {"id": "83fd1d96-1b09-4965-9a3b-a362e9ffc06e", "address": "fa:16:3e:6a:9d:c3", "network": {"id": "b6746973-374e-4be6-9067-1f3d7424fe1d", "bridge": "br-int", "label": "tempest-VolumesActionsTest-376048108-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "037d3963711a4727a4def98cfff4da67", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap83fd1d96-1b", "ovs_interfaceid": "83fd1d96-1b09-4965-9a3b-a362e9ffc06e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:54:08 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-5e1dca6a-52fe-4724-a03c-b7cfd9f25e9c tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:9d:c3,bridge_name='br-int',has_traffic_filtering=True,id=83fd1d96-1b09-4965-9a3b-a362e9ffc06e,network=Network(b6746973-374e-4be6-9067-1f3d7424fe1d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83fd1d96-1b') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:54:08 user nova-compute[71428]: DEBUG os_vif [None req-5e1dca6a-52fe-4724-a03c-b7cfd9f25e9c tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:9d:c3,bridge_name='br-int',has_traffic_filtering=True,id=83fd1d96-1b09-4965-9a3b-a362e9ffc06e,network=Network(b6746973-374e-4be6-9067-1f3d7424fe1d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83fd1d96-1b') {{(pid=71428) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 23 03:54:08 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:54:08 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap83fd1d96-1b, bridge=br-int, if_exists=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:54:08 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:54:08 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:54:08 user nova-compute[71428]: INFO os_vif [None req-5e1dca6a-52fe-4724-a03c-b7cfd9f25e9c tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:9d:c3,bridge_name='br-int',has_traffic_filtering=True,id=83fd1d96-1b09-4965-9a3b-a362e9ffc06e,network=Network(b6746973-374e-4be6-9067-1f3d7424fe1d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83fd1d96-1b') Apr 23 03:54:08 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-5e1dca6a-52fe-4724-a03c-b7cfd9f25e9c tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Deleting instance files /opt/stack/data/nova/instances/39775e82-0123-41cf-ad97-d3865a796828_del Apr 23 03:54:08 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-5e1dca6a-52fe-4724-a03c-b7cfd9f25e9c tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Deletion of /opt/stack/data/nova/instances/39775e82-0123-41cf-ad97-d3865a796828_del complete Apr 23 03:54:08 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:54:08 user nova-compute[71428]: INFO nova.compute.manager [None req-5e1dca6a-52fe-4724-a03c-b7cfd9f25e9c tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Took 0.66 seconds to destroy the instance on the hypervisor. Apr 23 03:54:08 user nova-compute[71428]: DEBUG oslo.service.loopingcall [None req-5e1dca6a-52fe-4724-a03c-b7cfd9f25e9c tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71428) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 23 03:54:08 user nova-compute[71428]: DEBUG nova.compute.manager [-] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Deallocating network for instance {{(pid=71428) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 23 03:54:08 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: 39775e82-0123-41cf-ad97-d3865a796828] deallocate_for_instance() {{(pid=71428) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 23 03:54:09 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Updating instance_info_cache with network_info: [] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:54:09 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Took 0.68 seconds to deallocate network for instance. 
Apr 23 03:54:09 user nova-compute[71428]: DEBUG nova.compute.manager [req-0ad2d0cb-c2d2-4ba5-850a-093b36c41505 req-f09fd772-45b4-4240-899e-7967b08eb091 service nova] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Received event network-vif-deleted-83fd1d96-1b09-4965-9a3b-a362e9ffc06e {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:54:09 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-5e1dca6a-52fe-4724-a03c-b7cfd9f25e9c tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:54:09 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-5e1dca6a-52fe-4724-a03c-b7cfd9f25e9c tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:54:09 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-5e1dca6a-52fe-4724-a03c-b7cfd9f25e9c tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:54:09 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-5e1dca6a-52fe-4724-a03c-b7cfd9f25e9c tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:54:09 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-5e1dca6a-52fe-4724-a03c-b7cfd9f25e9c tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.195s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:54:09 user nova-compute[71428]: INFO nova.scheduler.client.report [None req-5e1dca6a-52fe-4724-a03c-b7cfd9f25e9c tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Deleted allocations for instance 39775e82-0123-41cf-ad97-d3865a796828 Apr 23 03:54:09 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-5e1dca6a-52fe-4724-a03c-b7cfd9f25e9c tempest-VolumesActionsTest-911298328 tempest-VolumesActionsTest-911298328-project-member] Lock "39775e82-0123-41cf-ad97-d3865a796828" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.722s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:54:10 user 
nova-compute[71428]: DEBUG nova.compute.manager [req-9664da0e-a0a3-4391-bd1d-fa1cef4e8253 req-b49b525a-926d-473a-bc7b-032497a90e43 service nova] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Received event network-vif-plugged-83fd1d96-1b09-4965-9a3b-a362e9ffc06e {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:54:10 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-9664da0e-a0a3-4391-bd1d-fa1cef4e8253 req-b49b525a-926d-473a-bc7b-032497a90e43 service nova] Acquiring lock "39775e82-0123-41cf-ad97-d3865a796828-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:54:10 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-9664da0e-a0a3-4391-bd1d-fa1cef4e8253 req-b49b525a-926d-473a-bc7b-032497a90e43 service nova] Lock "39775e82-0123-41cf-ad97-d3865a796828-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:54:10 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-9664da0e-a0a3-4391-bd1d-fa1cef4e8253 req-b49b525a-926d-473a-bc7b-032497a90e43 service nova] Lock "39775e82-0123-41cf-ad97-d3865a796828-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:54:10 user nova-compute[71428]: DEBUG nova.compute.manager [req-9664da0e-a0a3-4391-bd1d-fa1cef4e8253 req-b49b525a-926d-473a-bc7b-032497a90e43 service nova] [instance: 39775e82-0123-41cf-ad97-d3865a796828] No waiting events found dispatching network-vif-plugged-83fd1d96-1b09-4965-9a3b-a362e9ffc06e {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:54:10 user nova-compute[71428]: WARNING nova.compute.manager [req-9664da0e-a0a3-4391-bd1d-fa1cef4e8253 req-b49b525a-926d-473a-bc7b-032497a90e43 service nova] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Received unexpected event network-vif-plugged-83fd1d96-1b09-4965-9a3b-a362e9ffc06e for instance with vm_state deleted and task_state None. 
Apr 23 03:54:10 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:54:10 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:54:13 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:54:13 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:54:13 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71428) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 23 03:54:13 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:54:15 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:54:16 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:54:17 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:54:17 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:54:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:54:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71428) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:54:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:54:17 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Auditing locally available compute resources for user (node: user) {{(pid=71428) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 23 03:54:17 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97b28eb9-634c-4f9b-97ef-41d9f45f19b7/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:54:17 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97b28eb9-634c-4f9b-97ef-41d9f45f19b7/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:54:17 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97b28eb9-634c-4f9b-97ef-41d9f45f19b7/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:54:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97b28eb9-634c-4f9b-97ef-41d9f45f19b7/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:54:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:54:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk --force-share --output=json" returned: 0 in 0.146s {{(pid=71428) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:54:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:54:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:54:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/954f41b2-5c67-41a4-b38b-ebf3ad60cac7/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:54:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/954f41b2-5c67-41a4-b38b-ebf3ad60cac7/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:54:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/954f41b2-5c67-41a4-b38b-ebf3ad60cac7/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:54:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/954f41b2-5c67-41a4-b38b-ebf3ad60cac7/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:54:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c21ec3a3-0761-4543-ad43-dca2913d159b/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:54:18 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:54:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c21ec3a3-0761-4543-ad43-dca2913d159b/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:54:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c21ec3a3-0761-4543-ad43-dca2913d159b/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:54:18 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:54:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c21ec3a3-0761-4543-ad43-dca2913d159b/disk --force-share --output=json" returned: 0 in 0.145s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:54:19 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:54:19 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 23 03:54:19 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Hypervisor/Node resource view: name=user free_ram=8723MB free_disk=26.221790313720703GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71428) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 23 03:54:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:54:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:54:19 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 954f41b2-5c67-41a4-b38b-ebf3ad60cac7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:54:19 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance c8480a7c-b5de-4f66-a4f0-08fc679b0dfd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:54:19 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance c21ec3a3-0761-4543-ad43-dca2913d159b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:54:19 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 97b28eb9-634c-4f9b-97ef-41d9f45f19b7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:54:19 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Total usable vcpus: 12, total allocated vcpus: 4 {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 23 03:54:19 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Final resource view: name=user phys_ram=16023MB used_ram=1024MB phys_disk=40GB used_disk=4GB total_vcpus=12 used_vcpus=4 pci_stats=[] {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 23 03:54:19 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:54:19 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:54:19 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Compute_service record updated for user:user {{(pid=71428) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 23 03:54:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.294s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:54:20 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:54:20 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Starting heal instance info cache {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 23 03:54:20 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Rebuilding the list of instances to heal {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 23 03:54:20 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "refresh_cache-954f41b2-5c67-41a4-b38b-ebf3ad60cac7" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:54:20 user nova-compute[71428]: DEBUG 
oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquired lock "refresh_cache-954f41b2-5c67-41a4-b38b-ebf3ad60cac7" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:54:20 user nova-compute[71428]: DEBUG nova.network.neutron [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Forcefully refreshing network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 23 03:54:20 user nova-compute[71428]: DEBUG nova.objects.instance [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lazy-loading 'info_cache' on Instance uuid 954f41b2-5c67-41a4-b38b-ebf3ad60cac7 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:54:21 user nova-compute[71428]: DEBUG nova.network.neutron [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Updating instance_info_cache with network_info: [{"id": "97d945a1-b86e-4a6d-af52-23084b8eb175", "address": "fa:16:3e:ee:63:58", "network": {"id": "bc639aed-946f-458c-b08d-ce3037f5507c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-558421264-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6f44fe9ba07e4e2f925d1a8952356e04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap97d945a1-b8", "ovs_interfaceid": "97d945a1-b86e-4a6d-af52-23084b8eb175", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:54:21 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Releasing lock "refresh_cache-954f41b2-5c67-41a4-b38b-ebf3ad60cac7" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:54:21 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Updated the network info_cache for instance {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 23 03:54:21 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:54:23 user nova-compute[71428]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:54:23 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: 39775e82-0123-41cf-ad97-d3865a796828] VM Stopped (Lifecycle Event) Apr 23 03:54:23 user nova-compute[71428]: DEBUG nova.compute.manager [None 
req-b0e71aac-e60d-4832-b713-1126c6b837c9 None None] [instance: 39775e82-0123-41cf-ad97-d3865a796828] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:54:23 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:54:28 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:54:28 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:54:28 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71428) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 23 03:54:28 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 03:54:28 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 03:54:28 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:54:33 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:54:33 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:54:38 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:54:42 user nova-compute[71428]: DEBUG nova.compute.manager [req-db7d8669-aa3d-4869-88f7-b4b3d2e48876 req-20600193-58cf-4e6d-94cd-869768c9ba96 service nova] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Received event network-changed-3cc621aa-a5f6-46de-bfd8-32dada7ca345 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:54:42 user nova-compute[71428]: DEBUG nova.compute.manager [req-db7d8669-aa3d-4869-88f7-b4b3d2e48876 req-20600193-58cf-4e6d-94cd-869768c9ba96 service nova] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Refreshing instance network info cache due to event network-changed-3cc621aa-a5f6-46de-bfd8-32dada7ca345. 
{{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 23 03:54:42 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-db7d8669-aa3d-4869-88f7-b4b3d2e48876 req-20600193-58cf-4e6d-94cd-869768c9ba96 service nova] Acquiring lock "refresh_cache-c21ec3a3-0761-4543-ad43-dca2913d159b" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:54:42 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-db7d8669-aa3d-4869-88f7-b4b3d2e48876 req-20600193-58cf-4e6d-94cd-869768c9ba96 service nova] Acquired lock "refresh_cache-c21ec3a3-0761-4543-ad43-dca2913d159b" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:54:42 user nova-compute[71428]: DEBUG nova.network.neutron [req-db7d8669-aa3d-4869-88f7-b4b3d2e48876 req-20600193-58cf-4e6d-94cd-869768c9ba96 service nova] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Refreshing network info cache for port 3cc621aa-a5f6-46de-bfd8-32dada7ca345 {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 23 03:54:43 user nova-compute[71428]: DEBUG nova.network.neutron [req-db7d8669-aa3d-4869-88f7-b4b3d2e48876 req-20600193-58cf-4e6d-94cd-869768c9ba96 service nova] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Updated VIF entry in instance network info cache for port 3cc621aa-a5f6-46de-bfd8-32dada7ca345. {{(pid=71428) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 23 03:54:43 user nova-compute[71428]: DEBUG nova.network.neutron [req-db7d8669-aa3d-4869-88f7-b4b3d2e48876 req-20600193-58cf-4e6d-94cd-869768c9ba96 service nova] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Updating instance_info_cache with network_info: [{"id": "3cc621aa-a5f6-46de-bfd8-32dada7ca345", "address": "fa:16:3e:e6:30:15", "network": {"id": "6adb189c-9b46-4da4-a34b-31edf8685498", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-759967402-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.154", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "24fff486a500421397ecb935828582cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cc621aa-a5", "ovs_interfaceid": "3cc621aa-a5f6-46de-bfd8-32dada7ca345", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:54:43 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-db7d8669-aa3d-4869-88f7-b4b3d2e48876 req-20600193-58cf-4e6d-94cd-869768c9ba96 service nova] Releasing lock "refresh_cache-c21ec3a3-0761-4543-ad43-dca2913d159b" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:54:43 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:54:43 user nova-compute[71428]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:54:44 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-b04a8a65-28ce-46b7-8c67-fd9c7f298b50 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Acquiring lock "c21ec3a3-0761-4543-ad43-dca2913d159b" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:54:44 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-b04a8a65-28ce-46b7-8c67-fd9c7f298b50 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "c21ec3a3-0761-4543-ad43-dca2913d159b" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:54:44 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-b04a8a65-28ce-46b7-8c67-fd9c7f298b50 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Acquiring lock "c21ec3a3-0761-4543-ad43-dca2913d159b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:54:44 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-b04a8a65-28ce-46b7-8c67-fd9c7f298b50 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "c21ec3a3-0761-4543-ad43-dca2913d159b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:54:44 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-b04a8a65-28ce-46b7-8c67-fd9c7f298b50 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "c21ec3a3-0761-4543-ad43-dca2913d159b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:54:44 user nova-compute[71428]: INFO nova.compute.manager [None req-b04a8a65-28ce-46b7-8c67-fd9c7f298b50 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Terminating instance Apr 23 03:54:44 user nova-compute[71428]: DEBUG nova.compute.manager [None req-b04a8a65-28ce-46b7-8c67-fd9c7f298b50 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Start destroying the instance on the hypervisor. 
{{(pid=71428) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 23 03:54:44 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:54:44 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:54:44 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:54:44 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:54:44 user nova-compute[71428]: DEBUG nova.compute.manager [req-f8307e94-0a11-4f98-8a96-88c15a4f34b2 req-e0bbd537-cc57-42ba-b402-4965baa1d86f service nova] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Received event network-vif-unplugged-3cc621aa-a5f6-46de-bfd8-32dada7ca345 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:54:44 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-f8307e94-0a11-4f98-8a96-88c15a4f34b2 req-e0bbd537-cc57-42ba-b402-4965baa1d86f service nova] Acquiring lock "c21ec3a3-0761-4543-ad43-dca2913d159b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:54:44 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-f8307e94-0a11-4f98-8a96-88c15a4f34b2 req-e0bbd537-cc57-42ba-b402-4965baa1d86f service nova] Lock "c21ec3a3-0761-4543-ad43-dca2913d159b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:54:44 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-f8307e94-0a11-4f98-8a96-88c15a4f34b2 req-e0bbd537-cc57-42ba-b402-4965baa1d86f service nova] Lock "c21ec3a3-0761-4543-ad43-dca2913d159b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:54:44 user nova-compute[71428]: DEBUG nova.compute.manager [req-f8307e94-0a11-4f98-8a96-88c15a4f34b2 req-e0bbd537-cc57-42ba-b402-4965baa1d86f service nova] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] No waiting events found dispatching network-vif-unplugged-3cc621aa-a5f6-46de-bfd8-32dada7ca345 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:54:44 user nova-compute[71428]: DEBUG nova.compute.manager [req-f8307e94-0a11-4f98-8a96-88c15a4f34b2 req-e0bbd537-cc57-42ba-b402-4965baa1d86f service nova] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Received event network-vif-unplugged-3cc621aa-a5f6-46de-bfd8-32dada7ca345 for instance with task_state deleting. 
{{(pid=71428) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 23 03:54:44 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:54:45 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Instance destroyed successfully. Apr 23 03:54:45 user nova-compute[71428]: DEBUG nova.objects.instance [None req-b04a8a65-28ce-46b7-8c67-fd9c7f298b50 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lazy-loading 'resources' on Instance uuid c21ec3a3-0761-4543-ad43-dca2913d159b {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:54:45 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-b04a8a65-28ce-46b7-8c67-fd9c7f298b50 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:52:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-159009824',display_name='tempest-AttachVolumeNegativeTest-server-159009824',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-159009824',id=16,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIs7ux48o2nnGrdeOCvQDySG+P6nXioDJdGS6oeGudA0JHf7ZhHzg403SyXAde6YT5+5r+rVL2ZP2ipzilgOZRPzvYzNtjeBNGfwBpGBza8LD5gUqZ8keZPqVn3njSi8NQ==',key_name='tempest-keypair-325673571',keypairs=,launch_index=0,launched_at=2023-04-23T03:52:58Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='24fff486a500421397ecb935828582cd',ramdisk_id='',reservation_id='r-ne1tzaqb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeNegativeTest-636753786',owner_user_name='tempest-AttachVolumeNegativeTest-636753786-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-23T03:52:59Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0a2c459cad014b07b2613e5e261d88aa',uuid=c21ec3a3-0761-4543-ad43-dca2913d159b,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3cc621aa-a5f6-46de-bfd8-32dada7ca345", "address": "fa:16:3e:e6:30:15", "network": {"id": "6adb189c-9b46-4da4-a34b-31edf8685498", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-759967402-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.154", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "24fff486a500421397ecb935828582cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cc621aa-a5", "ovs_interfaceid": "3cc621aa-a5f6-46de-bfd8-32dada7ca345", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 23 03:54:45 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-b04a8a65-28ce-46b7-8c67-fd9c7f298b50 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Converting VIF {"id": "3cc621aa-a5f6-46de-bfd8-32dada7ca345", "address": "fa:16:3e:e6:30:15", "network": {"id": "6adb189c-9b46-4da4-a34b-31edf8685498", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-759967402-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.154", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "24fff486a500421397ecb935828582cd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap3cc621aa-a5", "ovs_interfaceid": "3cc621aa-a5f6-46de-bfd8-32dada7ca345", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:54:45 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-b04a8a65-28ce-46b7-8c67-fd9c7f298b50 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e6:30:15,bridge_name='br-int',has_traffic_filtering=True,id=3cc621aa-a5f6-46de-bfd8-32dada7ca345,network=Network(6adb189c-9b46-4da4-a34b-31edf8685498),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cc621aa-a5') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:54:45 user nova-compute[71428]: DEBUG os_vif [None req-b04a8a65-28ce-46b7-8c67-fd9c7f298b50 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e6:30:15,bridge_name='br-int',has_traffic_filtering=True,id=3cc621aa-a5f6-46de-bfd8-32dada7ca345,network=Network(6adb189c-9b46-4da4-a34b-31edf8685498),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cc621aa-a5') {{(pid=71428) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 23 03:54:45 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:54:45 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3cc621aa-a5, bridge=br-int, if_exists=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:54:45 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:54:45 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:54:45 user nova-compute[71428]: INFO os_vif [None req-b04a8a65-28ce-46b7-8c67-fd9c7f298b50 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e6:30:15,bridge_name='br-int',has_traffic_filtering=True,id=3cc621aa-a5f6-46de-bfd8-32dada7ca345,network=Network(6adb189c-9b46-4da4-a34b-31edf8685498),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3cc621aa-a5') Apr 23 03:54:45 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-b04a8a65-28ce-46b7-8c67-fd9c7f298b50 tempest-AttachVolumeNegativeTest-636753786 
tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Deleting instance files /opt/stack/data/nova/instances/c21ec3a3-0761-4543-ad43-dca2913d159b_del Apr 23 03:54:45 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-b04a8a65-28ce-46b7-8c67-fd9c7f298b50 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Deletion of /opt/stack/data/nova/instances/c21ec3a3-0761-4543-ad43-dca2913d159b_del complete Apr 23 03:54:45 user nova-compute[71428]: INFO nova.compute.manager [None req-b04a8a65-28ce-46b7-8c67-fd9c7f298b50 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Took 0.83 seconds to destroy the instance on the hypervisor. Apr 23 03:54:45 user nova-compute[71428]: DEBUG oslo.service.loopingcall [None req-b04a8a65-28ce-46b7-8c67-fd9c7f298b50 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71428) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 23 03:54:45 user nova-compute[71428]: DEBUG nova.compute.manager [-] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Deallocating network for instance {{(pid=71428) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 23 03:54:45 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] deallocate_for_instance() {{(pid=71428) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 23 03:54:46 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Updating instance_info_cache with network_info: [] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:54:46 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Took 0.96 seconds to deallocate network for instance. 
Apr 23 03:54:46 user nova-compute[71428]: DEBUG nova.compute.manager [req-6510ea6f-1d62-49ff-bcc5-e0bcb4aa8d8a req-67f75e95-96aa-4b44-9332-9b69ec436acd service nova] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Received event network-vif-deleted-3cc621aa-a5f6-46de-bfd8-32dada7ca345 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:54:46 user nova-compute[71428]: INFO nova.compute.manager [req-6510ea6f-1d62-49ff-bcc5-e0bcb4aa8d8a req-67f75e95-96aa-4b44-9332-9b69ec436acd service nova] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Neutron deleted interface 3cc621aa-a5f6-46de-bfd8-32dada7ca345; detaching it from the instance and deleting it from the info cache Apr 23 03:54:46 user nova-compute[71428]: DEBUG nova.network.neutron [req-6510ea6f-1d62-49ff-bcc5-e0bcb4aa8d8a req-67f75e95-96aa-4b44-9332-9b69ec436acd service nova] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Updating instance_info_cache with network_info: [] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:54:46 user nova-compute[71428]: DEBUG nova.compute.manager [req-6510ea6f-1d62-49ff-bcc5-e0bcb4aa8d8a req-67f75e95-96aa-4b44-9332-9b69ec436acd service nova] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Detach interface failed, port_id=3cc621aa-a5f6-46de-bfd8-32dada7ca345, reason: Instance c21ec3a3-0761-4543-ad43-dca2913d159b could not be found. {{(pid=71428) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 23 03:54:46 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-b04a8a65-28ce-46b7-8c67-fd9c7f298b50 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:54:46 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-b04a8a65-28ce-46b7-8c67-fd9c7f298b50 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:54:46 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-b04a8a65-28ce-46b7-8c67-fd9c7f298b50 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:54:46 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-b04a8a65-28ce-46b7-8c67-fd9c7f298b50 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:54:46 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-b04a8a65-28ce-46b7-8c67-fd9c7f298b50 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.190s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:54:46 user nova-compute[71428]: INFO nova.scheduler.client.report [None req-b04a8a65-28ce-46b7-8c67-fd9c7f298b50 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Deleted allocations for instance c21ec3a3-0761-4543-ad43-dca2913d159b Apr 23 03:54:46 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-b04a8a65-28ce-46b7-8c67-fd9c7f298b50 tempest-AttachVolumeNegativeTest-636753786 tempest-AttachVolumeNegativeTest-636753786-project-member] Lock "c21ec3a3-0761-4543-ad43-dca2913d159b" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.167s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:54:46 user nova-compute[71428]: DEBUG nova.compute.manager [req-639c1637-b853-44ab-abc5-8b324bfe76fc req-228f655a-d9de-4628-a06f-920532afb42e service nova] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Received event network-vif-plugged-3cc621aa-a5f6-46de-bfd8-32dada7ca345 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:54:46 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-639c1637-b853-44ab-abc5-8b324bfe76fc req-228f655a-d9de-4628-a06f-920532afb42e service nova] Acquiring lock "c21ec3a3-0761-4543-ad43-dca2913d159b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:54:46 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-639c1637-b853-44ab-abc5-8b324bfe76fc req-228f655a-d9de-4628-a06f-920532afb42e service nova] Lock "c21ec3a3-0761-4543-ad43-dca2913d159b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:54:46 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-639c1637-b853-44ab-abc5-8b324bfe76fc req-228f655a-d9de-4628-a06f-920532afb42e service nova] Lock "c21ec3a3-0761-4543-ad43-dca2913d159b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:54:46 user nova-compute[71428]: DEBUG nova.compute.manager [req-639c1637-b853-44ab-abc5-8b324bfe76fc req-228f655a-d9de-4628-a06f-920532afb42e service nova] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] No waiting events found dispatching network-vif-plugged-3cc621aa-a5f6-46de-bfd8-32dada7ca345 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:54:46 user nova-compute[71428]: WARNING nova.compute.manager [req-639c1637-b853-44ab-abc5-8b324bfe76fc req-228f655a-d9de-4628-a06f-920532afb42e service nova] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Received unexpected event network-vif-plugged-3cc621aa-a5f6-46de-bfd8-32dada7ca345 
for instance with vm_state deleted and task_state None. Apr 23 03:54:48 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:54:50 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:54:50 user nova-compute[71428]: DEBUG nova.compute.manager [None req-a0e9a895-79cc-4bcd-a4e9-872d1a118eb6 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:54:50 user nova-compute[71428]: INFO nova.compute.manager [None req-a0e9a895-79cc-4bcd-a4e9-872d1a118eb6 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] instance snapshotting Apr 23 03:54:50 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-a0e9a895-79cc-4bcd-a4e9-872d1a118eb6 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Beginning live snapshot process Apr 23 03:54:50 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-a0e9a895-79cc-4bcd-a4e9-872d1a118eb6 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97b28eb9-634c-4f9b-97ef-41d9f45f19b7/disk --force-share --output=json -f qcow2 {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:54:50 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-a0e9a895-79cc-4bcd-a4e9-872d1a118eb6 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97b28eb9-634c-4f9b-97ef-41d9f45f19b7/disk --force-share --output=json -f qcow2" returned: 0 in 0.132s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:54:50 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-a0e9a895-79cc-4bcd-a4e9-872d1a118eb6 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97b28eb9-634c-4f9b-97ef-41d9f45f19b7/disk --force-share --output=json -f qcow2 {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:54:50 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-a0e9a895-79cc-4bcd-a4e9-872d1a118eb6 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit 
--as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97b28eb9-634c-4f9b-97ef-41d9f45f19b7/disk --force-share --output=json -f qcow2" returned: 0 in 0.135s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:54:50 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-a0e9a895-79cc-4bcd-a4e9-872d1a118eb6 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:54:50 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-a0e9a895-79cc-4bcd-a4e9-872d1a118eb6 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.128s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:54:50 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-a0e9a895-79cc-4bcd-a4e9-872d1a118eb6 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmpq_f8q8m9/ab8a979cd9fb4cf7bac770b152612edc.delta 1073741824 {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:54:51 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-a0e9a895-79cc-4bcd-a4e9-872d1a118eb6 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmpq_f8q8m9/ab8a979cd9fb4cf7bac770b152612edc.delta 1073741824" returned: 0 in 0.051s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:54:51 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-a0e9a895-79cc-4bcd-a4e9-872d1a118eb6 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Quiescing instance not available: QEMU guest agent is not enabled. 
Apr 23 03:54:51 user nova-compute[71428]: DEBUG nova.virt.libvirt.guest [None req-a0e9a895-79cc-4bcd-a4e9-872d1a118eb6 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] COPY block job progress, current cursor: 0 final cursor: 43778048 {{(pid=71428) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 23 03:54:52 user nova-compute[71428]: DEBUG nova.virt.libvirt.guest [None req-a0e9a895-79cc-4bcd-a4e9-872d1a118eb6 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] COPY block job progress, current cursor: 43778048 final cursor: 43778048 {{(pid=71428) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 23 03:54:52 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-a0e9a895-79cc-4bcd-a4e9-872d1a118eb6 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Skipping quiescing instance: QEMU guest agent is not enabled. Apr 23 03:54:52 user nova-compute[71428]: DEBUG nova.privsep.utils [None req-a0e9a895-79cc-4bcd-a4e9-872d1a118eb6 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71428) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 23 03:54:52 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-a0e9a895-79cc-4bcd-a4e9-872d1a118eb6 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmpq_f8q8m9/ab8a979cd9fb4cf7bac770b152612edc.delta /opt/stack/data/nova/instances/snapshots/tmpq_f8q8m9/ab8a979cd9fb4cf7bac770b152612edc {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:54:52 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-a0e9a895-79cc-4bcd-a4e9-872d1a118eb6 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmpq_f8q8m9/ab8a979cd9fb4cf7bac770b152612edc.delta /opt/stack/data/nova/instances/snapshots/tmpq_f8q8m9/ab8a979cd9fb4cf7bac770b152612edc" returned: 0 in 0.334s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:54:52 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-a0e9a895-79cc-4bcd-a4e9-872d1a118eb6 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Snapshot extracted, beginning image upload Apr 23 03:54:54 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-a0e9a895-79cc-4bcd-a4e9-872d1a118eb6 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Snapshot image upload complete Apr 23 03:54:54 user nova-compute[71428]: INFO nova.compute.manager [None req-a0e9a895-79cc-4bcd-a4e9-872d1a118eb6 tempest-ServerBootFromVolumeStableRescueTest-653032906 
tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Took 4.32 seconds to snapshot the instance on the hypervisor. Apr 23 03:54:55 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:55:00 user nova-compute[71428]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:55:00 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] VM Stopped (Lifecycle Event) Apr 23 03:55:00 user nova-compute[71428]: DEBUG nova.compute.manager [None req-582172a0-a193-4e1d-8e56-309c9fdda2f7 None None] [instance: c21ec3a3-0761-4543-ad43-dca2913d159b] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:55:00 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:55:05 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:55:05 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:55:05 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71428) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 23 03:55:05 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 03:55:05 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 03:55:05 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:55:07 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:55:10 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:55:10 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:55:12 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:55:13 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71428) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:55:13 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71428) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 23 03:55:15 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:55:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-5263524d-2251-4242-b133-6a2da954fdbb tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Acquiring lock "954f41b2-5c67-41a4-b38b-ebf3ad60cac7" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:55:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-5263524d-2251-4242-b133-6a2da954fdbb tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Lock "954f41b2-5c67-41a4-b38b-ebf3ad60cac7" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:55:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-5263524d-2251-4242-b133-6a2da954fdbb tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Acquiring lock "954f41b2-5c67-41a4-b38b-ebf3ad60cac7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:55:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-5263524d-2251-4242-b133-6a2da954fdbb tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Lock "954f41b2-5c67-41a4-b38b-ebf3ad60cac7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.002s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:55:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-5263524d-2251-4242-b133-6a2da954fdbb tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Lock "954f41b2-5c67-41a4-b38b-ebf3ad60cac7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:55:17 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:55:17 user nova-compute[71428]: INFO nova.compute.manager [None req-5263524d-2251-4242-b133-6a2da954fdbb tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Terminating instance Apr 23 03:55:17 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None 
req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:55:17 user nova-compute[71428]: DEBUG nova.compute.manager [None req-5263524d-2251-4242-b133-6a2da954fdbb tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Start destroying the instance on the hypervisor. {{(pid=71428) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 23 03:55:17 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:55:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:55:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:55:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:55:17 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Auditing locally available compute resources for user (node: user) {{(pid=71428) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 23 03:55:17 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:55:17 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:55:17 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:55:17 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:55:17 user nova-compute[71428]: DEBUG nova.compute.manager [req-b1de1bfa-78e2-40de-a8a6-c222efec6a6a req-4ba51817-8276-4310-b4b7-7c4df86e927d service nova] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Received event network-vif-unplugged-97d945a1-b86e-4a6d-af52-23084b8eb175 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:55:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-b1de1bfa-78e2-40de-a8a6-c222efec6a6a req-4ba51817-8276-4310-b4b7-7c4df86e927d service 
nova] Acquiring lock "954f41b2-5c67-41a4-b38b-ebf3ad60cac7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:55:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-b1de1bfa-78e2-40de-a8a6-c222efec6a6a req-4ba51817-8276-4310-b4b7-7c4df86e927d service nova] Lock "954f41b2-5c67-41a4-b38b-ebf3ad60cac7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:55:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-b1de1bfa-78e2-40de-a8a6-c222efec6a6a req-4ba51817-8276-4310-b4b7-7c4df86e927d service nova] Lock "954f41b2-5c67-41a4-b38b-ebf3ad60cac7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:55:17 user nova-compute[71428]: DEBUG nova.compute.manager [req-b1de1bfa-78e2-40de-a8a6-c222efec6a6a req-4ba51817-8276-4310-b4b7-7c4df86e927d service nova] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] No waiting events found dispatching network-vif-unplugged-97d945a1-b86e-4a6d-af52-23084b8eb175 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:55:17 user nova-compute[71428]: DEBUG nova.compute.manager [req-b1de1bfa-78e2-40de-a8a6-c222efec6a6a req-4ba51817-8276-4310-b4b7-7c4df86e927d service nova] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Received event network-vif-unplugged-97d945a1-b86e-4a6d-af52-23084b8eb175 for instance with task_state deleting. {{(pid=71428) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 23 03:55:18 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Instance destroyed successfully. 
Apr 23 03:55:18 user nova-compute[71428]: DEBUG nova.objects.instance [None req-5263524d-2251-4242-b133-6a2da954fdbb tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Lazy-loading 'resources' on Instance uuid 954f41b2-5c67-41a4-b38b-ebf3ad60cac7 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:55:18 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-5263524d-2251-4242-b133-6a2da954fdbb tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:47:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1796045730',display_name='tempest-ServerActionsTestJSON-server-1796045730',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serveractionstestjson-server-1796045730',id=5,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIkvMNG9bJWjIet22ePA1ljN9zECExBTHpeziEsLPMaW1zdmYmHvghy1cyRU7fOajJrIJuDX5HLH+Bcl14bDAwDDg/IMdmy4fis8XPrCxEDMbEbjK+Rs+0/rwp2gYOoLwQ==',key_name='tempest-keypair-1222563018',keypairs=,launch_index=0,launched_at=2023-04-23T03:47:26Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='6f44fe9ba07e4e2f925d1a8952356e04',ramdisk_id='',reservation_id='r-7q0n4thm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerActionsTestJSON-1398779434',owner_user_name='tempest-ServerActionsTestJSON-1398779434-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-23T03:47:27Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='cdc8fe94058c46c28d4b3f16dc1e77ed',uuid=954f41b2-5c67-41a4-b38b-ebf3ad60cac7,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "97d945a1-b86e-4a6d-af52-23084b8eb175", "address": "fa:16:3e:ee:63:58", "network": {"id": "bc639aed-946f-458c-b08d-ce3037f5507c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-558421264-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "172.24.4.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6f44fe9ba07e4e2f925d1a8952356e04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap97d945a1-b8", "ovs_interfaceid": "97d945a1-b86e-4a6d-af52-23084b8eb175", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 23 03:55:18 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-5263524d-2251-4242-b133-6a2da954fdbb tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Converting VIF {"id": "97d945a1-b86e-4a6d-af52-23084b8eb175", "address": "fa:16:3e:ee:63:58", "network": {"id": "bc639aed-946f-458c-b08d-ce3037f5507c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-558421264-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6f44fe9ba07e4e2f925d1a8952356e04", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap97d945a1-b8", "ovs_interfaceid": "97d945a1-b86e-4a6d-af52-23084b8eb175", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:55:18 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-5263524d-2251-4242-b133-6a2da954fdbb tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ee:63:58,bridge_name='br-int',has_traffic_filtering=True,id=97d945a1-b86e-4a6d-af52-23084b8eb175,network=Network(bc639aed-946f-458c-b08d-ce3037f5507c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap97d945a1-b8') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:55:18 user nova-compute[71428]: DEBUG os_vif [None req-5263524d-2251-4242-b133-6a2da954fdbb tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ee:63:58,bridge_name='br-int',has_traffic_filtering=True,id=97d945a1-b86e-4a6d-af52-23084b8eb175,network=Network(bc639aed-946f-458c-b08d-ce3037f5507c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap97d945a1-b8') {{(pid=71428) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 23 03:55:18 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:55:18 user 
nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap97d945a1-b8, bridge=br-int, if_exists=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:55:18 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:55:18 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:55:18 user nova-compute[71428]: INFO os_vif [None req-5263524d-2251-4242-b133-6a2da954fdbb tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ee:63:58,bridge_name='br-int',has_traffic_filtering=True,id=97d945a1-b86e-4a6d-af52-23084b8eb175,network=Network(bc639aed-946f-458c-b08d-ce3037f5507c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap97d945a1-b8') Apr 23 03:55:18 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-5263524d-2251-4242-b133-6a2da954fdbb tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Deleting instance files /opt/stack/data/nova/instances/954f41b2-5c67-41a4-b38b-ebf3ad60cac7_del Apr 23 03:55:18 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-5263524d-2251-4242-b133-6a2da954fdbb tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Deletion of /opt/stack/data/nova/instances/954f41b2-5c67-41a4-b38b-ebf3ad60cac7_del complete Apr 23 03:55:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97b28eb9-634c-4f9b-97ef-41d9f45f19b7/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:55:18 user nova-compute[71428]: INFO nova.compute.manager [None req-5263524d-2251-4242-b133-6a2da954fdbb tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Took 0.73 seconds to destroy the instance on the hypervisor. Apr 23 03:55:18 user nova-compute[71428]: DEBUG oslo.service.loopingcall [None req-5263524d-2251-4242-b133-6a2da954fdbb tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71428) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 23 03:55:18 user nova-compute[71428]: DEBUG nova.compute.manager [-] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Deallocating network for instance {{(pid=71428) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 23 03:55:18 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] deallocate_for_instance() {{(pid=71428) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 23 03:55:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97b28eb9-634c-4f9b-97ef-41d9f45f19b7/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:55:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97b28eb9-634c-4f9b-97ef-41d9f45f19b7/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:55:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97b28eb9-634c-4f9b-97ef-41d9f45f19b7/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:55:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:55:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:55:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:55:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD 
"/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:55:18 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Error from libvirt while getting description of instance-00000005: [Error Code 42] Domain not found: no domain with matching uuid '954f41b2-5c67-41a4-b38b-ebf3ad60cac7' (instance-00000005): libvirt.libvirtError: Domain not found: no domain with matching uuid '954f41b2-5c67-41a4-b38b-ebf3ad60cac7' (instance-00000005) Apr 23 03:55:19 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:55:19 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:55:19 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Hypervisor/Node resource view: name=user free_ram=8855MB free_disk=26.202960968017578GB free_vcpus=9 pci_devices=[{"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": 
"0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71428) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 23 03:55:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:55:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:55:19 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Updating instance_info_cache with network_info: [] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:55:19 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Took 0.98 seconds to deallocate network for instance. 
Apr 23 03:55:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-5263524d-2251-4242-b133-6a2da954fdbb tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:55:19 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 954f41b2-5c67-41a4-b38b-ebf3ad60cac7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:55:19 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance c8480a7c-b5de-4f66-a4f0-08fc679b0dfd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:55:19 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 97b28eb9-634c-4f9b-97ef-41d9f45f19b7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:55:19 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Total usable vcpus: 12, total allocated vcpus: 3 {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 23 03:55:19 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Final resource view: name=user phys_ram=16023MB used_ram=896MB phys_disk=40GB used_disk=3GB total_vcpus=12 used_vcpus=3 pci_stats=[] {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 23 03:55:19 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:55:19 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:55:19 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Compute_service record updated for user:user {{(pid=71428) _update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 23 03:55:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.241s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:55:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-5263524d-2251-4242-b133-6a2da954fdbb tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.161s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:55:19 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-5263524d-2251-4242-b133-6a2da954fdbb tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:55:19 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-5263524d-2251-4242-b133-6a2da954fdbb tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:55:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-5263524d-2251-4242-b133-6a2da954fdbb tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.157s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:55:19 user nova-compute[71428]: INFO nova.scheduler.client.report [None req-5263524d-2251-4242-b133-6a2da954fdbb tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Deleted allocations for instance 954f41b2-5c67-41a4-b38b-ebf3ad60cac7 Apr 23 03:55:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-5263524d-2251-4242-b133-6a2da954fdbb tempest-ServerActionsTestJSON-1398779434 tempest-ServerActionsTestJSON-1398779434-project-member] Lock "954f41b2-5c67-41a4-b38b-ebf3ad60cac7" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.210s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:55:20 user nova-compute[71428]: DEBUG nova.compute.manager [req-b076f8a8-db6c-401a-9ff2-d25c30df96e6 req-8d47748a-fb1e-4112-9fc7-838eda4f1553 service nova] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Received event network-vif-plugged-97d945a1-b86e-4a6d-af52-23084b8eb175 {{(pid=71428) external_instance_event 
/opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:55:20 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-b076f8a8-db6c-401a-9ff2-d25c30df96e6 req-8d47748a-fb1e-4112-9fc7-838eda4f1553 service nova] Acquiring lock "954f41b2-5c67-41a4-b38b-ebf3ad60cac7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:55:20 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-b076f8a8-db6c-401a-9ff2-d25c30df96e6 req-8d47748a-fb1e-4112-9fc7-838eda4f1553 service nova] Lock "954f41b2-5c67-41a4-b38b-ebf3ad60cac7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:55:20 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-b076f8a8-db6c-401a-9ff2-d25c30df96e6 req-8d47748a-fb1e-4112-9fc7-838eda4f1553 service nova] Lock "954f41b2-5c67-41a4-b38b-ebf3ad60cac7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:55:20 user nova-compute[71428]: DEBUG nova.compute.manager [req-b076f8a8-db6c-401a-9ff2-d25c30df96e6 req-8d47748a-fb1e-4112-9fc7-838eda4f1553 service nova] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] No waiting events found dispatching network-vif-plugged-97d945a1-b86e-4a6d-af52-23084b8eb175 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:55:20 user nova-compute[71428]: WARNING nova.compute.manager [req-b076f8a8-db6c-401a-9ff2-d25c30df96e6 req-8d47748a-fb1e-4112-9fc7-838eda4f1553 service nova] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Received unexpected event network-vif-plugged-97d945a1-b86e-4a6d-af52-23084b8eb175 for instance with vm_state deleted and task_state None. 
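A note on the resource accounting logged above: the "Final resource view" (used_ram=896MB, used_disk=3GB, used_vcpus=3) is consistent with the three per-instance allocations of {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1} plus the 512 MB memory reservation in the placement inventory. A minimal sketch of that arithmetic, with the values copied from the log entries above (the variable names are illustrative, not Nova internals):

    # Recompute the "Final resource view" figures from the allocations and
    # inventory reported in the log above. All values are copied from the log.
    allocations = [
        {"DISK_GB": 1, "MEMORY_MB": 128, "VCPU": 1},  # instance 954f41b2-...
        {"DISK_GB": 1, "MEMORY_MB": 128, "VCPU": 1},  # instance c8480a7c-...
        {"DISK_GB": 1, "MEMORY_MB": 128, "VCPU": 1},  # instance 97b28eb9-...
    ]
    inventory = {
        "VCPU": {"total": 12, "reserved": 0, "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 16023, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB": {"total": 40, "reserved": 0, "allocation_ratio": 1.0},
    }

    used_vcpus = sum(a["VCPU"] for a in allocations)            # 3
    used_ram = inventory["MEMORY_MB"]["reserved"] + sum(
        a["MEMORY_MB"] for a in allocations)                    # 512 + 3*128 = 896 MB
    used_disk = sum(a["DISK_GB"] for a in allocations)          # 3 GB

    # Schedulable capacity as placement computes it: (total - reserved) * ratio,
    # e.g. 12 VCPUs * 4.0 = 48 schedulable VCPUs on this host.
    vcpu_capacity = (inventory["VCPU"]["total"]
                     - inventory["VCPU"]["reserved"]) * inventory["VCPU"]["allocation_ratio"]

    print(used_vcpus, used_ram, used_disk, vcpu_capacity)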
Apr 23 03:55:20 user nova-compute[71428]: DEBUG nova.compute.manager [req-b076f8a8-db6c-401a-9ff2-d25c30df96e6 req-8d47748a-fb1e-4112-9fc7-838eda4f1553 service nova] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Received event network-vif-deleted-97d945a1-b86e-4a6d-af52-23084b8eb175 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:55:21 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:55:21 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Starting heal instance info cache {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 23 03:55:21 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "refresh_cache-c8480a7c-b5de-4f66-a4f0-08fc679b0dfd" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:55:21 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquired lock "refresh_cache-c8480a7c-b5de-4f66-a4f0-08fc679b0dfd" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:55:21 user nova-compute[71428]: DEBUG nova.network.neutron [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Forcefully refreshing network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 23 03:55:22 user nova-compute[71428]: DEBUG nova.network.neutron [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Updating instance_info_cache with network_info: [{"id": "e7548d21-c865-4648-a8a6-dba748a01e14", "address": "fa:16:3e:08:69:16", "network": {"id": "25de9e6c-4c02-4658-b076-466681d96605", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1644366708-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a79e09b1c4ae4cc5ab11c3e56ee4f0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape7548d21-c8", "ovs_interfaceid": "e7548d21-c865-4648-a8a6-dba748a01e14", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:55:22 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Releasing lock "refresh_cache-c8480a7c-b5de-4f66-a4f0-08fc679b0dfd" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:55:22 user nova-compute[71428]: DEBUG nova.compute.manager [None 
req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Updated the network info_cache for instance {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 23 03:55:22 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:55:22 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:55:22 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:55:23 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:55:23 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:55:24 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:55:28 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:55:28 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:55:28 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:55:30 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:55:33 user nova-compute[71428]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:55:33 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] VM Stopped (Lifecycle Event) Apr 23 03:55:33 user nova-compute[71428]: DEBUG nova.compute.manager [None req-38f53105-32d1-48ce-bab0-d8dd95423766 None None] [instance: 954f41b2-5c67-41a4-b38b-ebf3ad60cac7] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:55:33 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:55:33 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:55:36 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:55:38 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog 
[-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:55:38 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:55:38 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:55:39 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:55:43 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:55:48 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:55:48 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:55:48 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:55:53 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:55:57 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Acquiring lock "a3378117-215b-457c-b9ba-5099c5810e4d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:55:57 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "a3378117-215b-457c-b9ba-5099c5810e4d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:55:57 user nova-compute[71428]: DEBUG nova.compute.manager [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: a3378117-215b-457c-b9ba-5099c5810e4d] Starting instance... 
{{(pid=71428) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 23 03:55:57 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:55:57 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:55:57 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71428) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 23 03:55:57 user nova-compute[71428]: INFO nova.compute.claims [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: a3378117-215b-457c-b9ba-5099c5810e4d] Claim successful on node user Apr 23 03:55:57 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:55:57 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:55:57 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.247s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:55:57 user nova-compute[71428]: DEBUG nova.compute.manager [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 
tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: a3378117-215b-457c-b9ba-5099c5810e4d] Start building networks asynchronously for instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 23 03:55:57 user nova-compute[71428]: DEBUG nova.compute.manager [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: a3378117-215b-457c-b9ba-5099c5810e4d] Allocating IP information in the background. {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 23 03:55:57 user nova-compute[71428]: DEBUG nova.network.neutron [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: a3378117-215b-457c-b9ba-5099c5810e4d] allocate_for_instance() {{(pid=71428) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 23 03:55:57 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: a3378117-215b-457c-b9ba-5099c5810e4d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 23 03:55:57 user nova-compute[71428]: DEBUG nova.compute.manager [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: a3378117-215b-457c-b9ba-5099c5810e4d] Start building block device mappings for instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 23 03:55:57 user nova-compute[71428]: INFO nova.virt.block_device [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: a3378117-215b-457c-b9ba-5099c5810e4d] Booting with volume-backed-image e6127373-9931-4277-9458-eceef653ea1e at /dev/vda Apr 23 03:55:57 user nova-compute[71428]: DEBUG nova.policy [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '99495726467944c38620831fc93e2856', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a79e09b1c4ae4cc5ab11c3e56ee4f0d9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71428) authorize /opt/stack/nova/nova/policy.py:203}} Apr 23 03:55:58 user nova-compute[71428]: WARNING nova.compute.manager [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Volume id: 14ad1cf7-6bce-4422-adc3-9fb2b9fa75a4 finished being created but its status is error. 
Apr 23 03:55:58 user nova-compute[71428]: ERROR nova.compute.manager [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: a3378117-215b-457c-b9ba-5099c5810e4d] Instance failed block device setup: nova.exception.VolumeNotCreated: Volume 14ad1cf7-6bce-4422-adc3-9fb2b9fa75a4 did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. Apr 23 03:55:58 user nova-compute[71428]: ERROR nova.compute.manager [instance: a3378117-215b-457c-b9ba-5099c5810e4d] Traceback (most recent call last): Apr 23 03:55:58 user nova-compute[71428]: ERROR nova.compute.manager [instance: a3378117-215b-457c-b9ba-5099c5810e4d] File "/opt/stack/nova/nova/compute/manager.py", line 2175, in _prep_block_device Apr 23 03:55:58 user nova-compute[71428]: ERROR nova.compute.manager [instance: a3378117-215b-457c-b9ba-5099c5810e4d] driver_block_device.attach_block_devices( Apr 23 03:55:58 user nova-compute[71428]: ERROR nova.compute.manager [instance: a3378117-215b-457c-b9ba-5099c5810e4d] File "/opt/stack/nova/nova/virt/block_device.py", line 936, in attach_block_devices Apr 23 03:55:58 user nova-compute[71428]: ERROR nova.compute.manager [instance: a3378117-215b-457c-b9ba-5099c5810e4d] _log_and_attach(device) Apr 23 03:55:58 user nova-compute[71428]: ERROR nova.compute.manager [instance: a3378117-215b-457c-b9ba-5099c5810e4d] File "/opt/stack/nova/nova/virt/block_device.py", line 933, in _log_and_attach Apr 23 03:55:58 user nova-compute[71428]: ERROR nova.compute.manager [instance: a3378117-215b-457c-b9ba-5099c5810e4d] bdm.attach(*attach_args, **attach_kwargs) Apr 23 03:55:58 user nova-compute[71428]: ERROR nova.compute.manager [instance: a3378117-215b-457c-b9ba-5099c5810e4d] File "/opt/stack/nova/nova/virt/block_device.py", line 831, in attach Apr 23 03:55:58 user nova-compute[71428]: ERROR nova.compute.manager [instance: a3378117-215b-457c-b9ba-5099c5810e4d] self.volume_id, self.attachment_id = self._create_volume( Apr 23 03:55:58 user nova-compute[71428]: ERROR nova.compute.manager [instance: a3378117-215b-457c-b9ba-5099c5810e4d] File "/opt/stack/nova/nova/virt/block_device.py", line 435, in _create_volume Apr 23 03:55:58 user nova-compute[71428]: ERROR nova.compute.manager [instance: a3378117-215b-457c-b9ba-5099c5810e4d] self._call_wait_func(context, wait_func, volume_api, vol['id']) Apr 23 03:55:58 user nova-compute[71428]: ERROR nova.compute.manager [instance: a3378117-215b-457c-b9ba-5099c5810e4d] File "/opt/stack/nova/nova/virt/block_device.py", line 785, in _call_wait_func Apr 23 03:55:58 user nova-compute[71428]: ERROR nova.compute.manager [instance: a3378117-215b-457c-b9ba-5099c5810e4d] with excutils.save_and_reraise_exception(): Apr 23 03:55:58 user nova-compute[71428]: ERROR nova.compute.manager [instance: a3378117-215b-457c-b9ba-5099c5810e4d] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ Apr 23 03:55:58 user nova-compute[71428]: ERROR nova.compute.manager [instance: a3378117-215b-457c-b9ba-5099c5810e4d] self.force_reraise() Apr 23 03:55:58 user nova-compute[71428]: ERROR nova.compute.manager [instance: a3378117-215b-457c-b9ba-5099c5810e4d] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise Apr 23 03:55:58 user nova-compute[71428]: ERROR nova.compute.manager [instance: a3378117-215b-457c-b9ba-5099c5810e4d] raise self.value Apr 23 03:55:58 user 
nova-compute[71428]: ERROR nova.compute.manager [instance: a3378117-215b-457c-b9ba-5099c5810e4d] File "/opt/stack/nova/nova/virt/block_device.py", line 783, in _call_wait_func Apr 23 03:55:58 user nova-compute[71428]: ERROR nova.compute.manager [instance: a3378117-215b-457c-b9ba-5099c5810e4d] wait_func(context, volume_id) Apr 23 03:55:58 user nova-compute[71428]: ERROR nova.compute.manager [instance: a3378117-215b-457c-b9ba-5099c5810e4d] File "/opt/stack/nova/nova/compute/manager.py", line 1792, in _await_block_device_map_created Apr 23 03:55:58 user nova-compute[71428]: ERROR nova.compute.manager [instance: a3378117-215b-457c-b9ba-5099c5810e4d] raise exception.VolumeNotCreated(volume_id=vol_id, Apr 23 03:55:58 user nova-compute[71428]: ERROR nova.compute.manager [instance: a3378117-215b-457c-b9ba-5099c5810e4d] nova.exception.VolumeNotCreated: Volume 14ad1cf7-6bce-4422-adc3-9fb2b9fa75a4 did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. Apr 23 03:55:58 user nova-compute[71428]: ERROR nova.compute.manager [instance: a3378117-215b-457c-b9ba-5099c5810e4d] Apr 23 03:55:58 user nova-compute[71428]: DEBUG nova.network.neutron [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: a3378117-215b-457c-b9ba-5099c5810e4d] Successfully created port: a398adce-d4fb-46e2-a8b1-38f906316a55 {{(pid=71428) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 23 03:55:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:55:58 user nova-compute[71428]: DEBUG nova.network.neutron [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: a3378117-215b-457c-b9ba-5099c5810e4d] Successfully updated port: a398adce-d4fb-46e2-a8b1-38f906316a55 {{(pid=71428) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 23 03:55:59 user nova-compute[71428]: DEBUG nova.compute.manager [req-9fb46229-04f3-495b-81bb-bbd17196e754 req-798bcddc-ecf9-425f-a74e-8016380f9917 service nova] [instance: a3378117-215b-457c-b9ba-5099c5810e4d] Received event network-changed-a398adce-d4fb-46e2-a8b1-38f906316a55 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:55:59 user nova-compute[71428]: DEBUG nova.compute.manager [req-9fb46229-04f3-495b-81bb-bbd17196e754 req-798bcddc-ecf9-425f-a74e-8016380f9917 service nova] [instance: a3378117-215b-457c-b9ba-5099c5810e4d] Refreshing instance network info cache due to event network-changed-a398adce-d4fb-46e2-a8b1-38f906316a55. 
{{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 23 03:55:59 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-9fb46229-04f3-495b-81bb-bbd17196e754 req-798bcddc-ecf9-425f-a74e-8016380f9917 service nova] Acquiring lock "refresh_cache-a3378117-215b-457c-b9ba-5099c5810e4d" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:55:59 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-9fb46229-04f3-495b-81bb-bbd17196e754 req-798bcddc-ecf9-425f-a74e-8016380f9917 service nova] Acquired lock "refresh_cache-a3378117-215b-457c-b9ba-5099c5810e4d" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:55:59 user nova-compute[71428]: DEBUG nova.network.neutron [req-9fb46229-04f3-495b-81bb-bbd17196e754 req-798bcddc-ecf9-425f-a74e-8016380f9917 service nova] [instance: a3378117-215b-457c-b9ba-5099c5810e4d] Refreshing network info cache for port a398adce-d4fb-46e2-a8b1-38f906316a55 {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 23 03:55:59 user nova-compute[71428]: DEBUG nova.network.neutron [req-9fb46229-04f3-495b-81bb-bbd17196e754 req-798bcddc-ecf9-425f-a74e-8016380f9917 service nova] [instance: a3378117-215b-457c-b9ba-5099c5810e4d] Instance cache missing network info. {{(pid=71428) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 23 03:55:59 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Acquiring lock "refresh_cache-a3378117-215b-457c-b9ba-5099c5810e4d" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:55:59 user nova-compute[71428]: DEBUG nova.network.neutron [req-9fb46229-04f3-495b-81bb-bbd17196e754 req-798bcddc-ecf9-425f-a74e-8016380f9917 service nova] [instance: a3378117-215b-457c-b9ba-5099c5810e4d] Updating instance_info_cache with network_info: [] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:55:59 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-9fb46229-04f3-495b-81bb-bbd17196e754 req-798bcddc-ecf9-425f-a74e-8016380f9917 service nova] Releasing lock "refresh_cache-a3378117-215b-457c-b9ba-5099c5810e4d" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:55:59 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Acquired lock "refresh_cache-a3378117-215b-457c-b9ba-5099c5810e4d" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:55:59 user nova-compute[71428]: DEBUG nova.network.neutron [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: a3378117-215b-457c-b9ba-5099c5810e4d] Building network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 23 03:55:59 user nova-compute[71428]: DEBUG nova.network.neutron [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 
tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: a3378117-215b-457c-b9ba-5099c5810e4d] Instance cache missing network info. {{(pid=71428) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 23 03:55:59 user nova-compute[71428]: DEBUG nova.network.neutron [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: a3378117-215b-457c-b9ba-5099c5810e4d] Updating instance_info_cache with network_info: [{"id": "a398adce-d4fb-46e2-a8b1-38f906316a55", "address": "fa:16:3e:dd:4b:b8", "network": {"id": "25de9e6c-4c02-4658-b076-466681d96605", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1644366708-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a79e09b1c4ae4cc5ab11c3e56ee4f0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa398adce-d4", "ovs_interfaceid": "a398adce-d4fb-46e2-a8b1-38f906316a55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:55:59 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Releasing lock "refresh_cache-a3378117-215b-457c-b9ba-5099c5810e4d" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:55:59 user nova-compute[71428]: DEBUG nova.compute.manager [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: a3378117-215b-457c-b9ba-5099c5810e4d] Instance network_info: |[{"id": "a398adce-d4fb-46e2-a8b1-38f906316a55", "address": "fa:16:3e:dd:4b:b8", "network": {"id": "25de9e6c-4c02-4658-b076-466681d96605", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1644366708-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a79e09b1c4ae4cc5ab11c3e56ee4f0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa398adce-d4", "ovs_interfaceid": "a398adce-d4fb-46e2-a8b1-38f906316a55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71428) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1972}} Apr 23 03:55:59 user nova-compute[71428]: DEBUG nova.compute.claims [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: a3378117-215b-457c-b9ba-5099c5810e4d] Aborting claim: {{(pid=71428) abort /opt/stack/nova/nova/compute/claims.py:84}} Apr 23 03:55:59 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:55:59 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:55:59 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:55:59 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:55:59 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.216s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:55:59 user nova-compute[71428]: DEBUG nova.compute.manager [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: a3378117-215b-457c-b9ba-5099c5810e4d] Build of instance a3378117-215b-457c-b9ba-5099c5810e4d aborted: Volume 14ad1cf7-6bce-4422-adc3-9fb2b9fa75a4 did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. 
{{(pid=71428) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2636}} Apr 23 03:55:59 user nova-compute[71428]: DEBUG nova.compute.utils [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: a3378117-215b-457c-b9ba-5099c5810e4d] Build of instance a3378117-215b-457c-b9ba-5099c5810e4d aborted: Volume 14ad1cf7-6bce-4422-adc3-9fb2b9fa75a4 did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. {{(pid=71428) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} Apr 23 03:55:59 user nova-compute[71428]: ERROR nova.compute.manager [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: a3378117-215b-457c-b9ba-5099c5810e4d] Build of instance a3378117-215b-457c-b9ba-5099c5810e4d aborted: Volume 14ad1cf7-6bce-4422-adc3-9fb2b9fa75a4 did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error.: nova.exception.BuildAbortException: Build of instance a3378117-215b-457c-b9ba-5099c5810e4d aborted: Volume 14ad1cf7-6bce-4422-adc3-9fb2b9fa75a4 did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. Apr 23 03:55:59 user nova-compute[71428]: DEBUG nova.compute.manager [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: a3378117-215b-457c-b9ba-5099c5810e4d] Unplugging VIFs for instance {{(pid=71428) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} Apr 23 03:55:59 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:55:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1441120197',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1441120197',id=18,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a79e09b1c4ae4cc5ab11c3e56ee4f0d9',ramdisk_id='',reservation_id='r-p43efv5w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-653032906',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member'},tags=TagList,task_state='block_device_mapping',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:55:57Z,user_data=None,user_id='99495726467944c38620831fc93e2856',uuid=a3378117-215b-457c-b9ba-5099c5810e4d,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a398adce-d4fb-46e2-a8b1-38f906316a55", "address": "fa:16:3e:dd:4b:b8", "network": {"id": "25de9e6c-4c02-4658-b076-466681d96605", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1644366708-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a79e09b1c4ae4cc5ab11c3e56ee4f0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa398adce-d4", "ovs_interfaceid": "a398adce-d4fb-46e2-a8b1-38f906316a55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 23 03:55:59 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Converting VIF {"id": "a398adce-d4fb-46e2-a8b1-38f906316a55", "address": "fa:16:3e:dd:4b:b8", "network": {"id": 
"25de9e6c-4c02-4658-b076-466681d96605", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1644366708-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a79e09b1c4ae4cc5ab11c3e56ee4f0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa398adce-d4", "ovs_interfaceid": "a398adce-d4fb-46e2-a8b1-38f906316a55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:55:59 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:4b:b8,bridge_name='br-int',has_traffic_filtering=True,id=a398adce-d4fb-46e2-a8b1-38f906316a55,network=Network(25de9e6c-4c02-4658-b076-466681d96605),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa398adce-d4') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:55:59 user nova-compute[71428]: DEBUG os_vif [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:4b:b8,bridge_name='br-int',has_traffic_filtering=True,id=a398adce-d4fb-46e2-a8b1-38f906316a55,network=Network(25de9e6c-4c02-4658-b076-466681d96605),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa398adce-d4') {{(pid=71428) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 23 03:55:59 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:55:59 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa398adce-d4, bridge=br-int, if_exists=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:55:59 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 23 03:55:59 user nova-compute[71428]: INFO os_vif [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:4b:b8,bridge_name='br-int',has_traffic_filtering=True,id=a398adce-d4fb-46e2-a8b1-38f906316a55,network=Network(25de9e6c-4c02-4658-b076-466681d96605),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa398adce-d4') 
Apr 23 03:55:59 user nova-compute[71428]: DEBUG nova.compute.manager [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: a3378117-215b-457c-b9ba-5099c5810e4d] Unplugged VIFs for instance {{(pid=71428) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} Apr 23 03:55:59 user nova-compute[71428]: DEBUG nova.compute.manager [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: a3378117-215b-457c-b9ba-5099c5810e4d] Deallocating network for instance {{(pid=71428) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 23 03:55:59 user nova-compute[71428]: DEBUG nova.network.neutron [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: a3378117-215b-457c-b9ba-5099c5810e4d] deallocate_for_instance() {{(pid=71428) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 23 03:56:00 user nova-compute[71428]: DEBUG nova.network.neutron [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: a3378117-215b-457c-b9ba-5099c5810e4d] Updating instance_info_cache with network_info: [] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:56:00 user nova-compute[71428]: INFO nova.compute.manager [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: a3378117-215b-457c-b9ba-5099c5810e4d] Took 0.52 seconds to deallocate network for instance. 
Apr 23 03:56:00 user nova-compute[71428]: INFO nova.scheduler.client.report [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Deleted allocations for instance a3378117-215b-457c-b9ba-5099c5810e4d Apr 23 03:56:00 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-54776d9f-a033-4c4c-a2a8-945c144136d1 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "a3378117-215b-457c-b9ba-5099c5810e4d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 3.492s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:56:03 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 3438-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:56:03 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:56:03 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71428) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 23 03:56:03 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 03:56:03 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 03:56:03 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:56:08 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:56:11 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:56:11 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:56:13 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:56:13 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:56:14 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:56:14 user nova-compute[71428]: DEBUG 
nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71428) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 23 03:56:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:56:17 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:56:18 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:56:18 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:56:18 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:56:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:56:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:56:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:56:18 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Auditing locally available compute resources for user (node: user) {{(pid=71428) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 23 03:56:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97b28eb9-634c-4f9b-97ef-41d9f45f19b7/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:56:18 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:56:18 user nova-compute[71428]: 
DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97b28eb9-634c-4f9b-97ef-41d9f45f19b7/disk --force-share --output=json" returned: 0 in 0.145s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:56:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97b28eb9-634c-4f9b-97ef-41d9f45f19b7/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:56:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97b28eb9-634c-4f9b-97ef-41d9f45f19b7/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:56:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:56:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:56:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:56:19 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:56:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk --force-share --output=json" returned: 0 in 0.147s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:56:19 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None 
req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:56:19 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:56:19 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Hypervisor/Node resource view: name=user free_ram=8979MB free_disk=26.194087982177734GB free_vcpus=10 pci_devices=[{"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, 
"label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": 
"0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71428) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 23 03:56:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:56:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:56:19 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance c8480a7c-b5de-4f66-a4f0-08fc679b0dfd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:56:19 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 97b28eb9-634c-4f9b-97ef-41d9f45f19b7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:56:19 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Total usable vcpus: 12, total allocated vcpus: 2 {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 23 03:56:19 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Final resource view: name=user phys_ram=16023MB used_ram=768MB phys_disk=40GB used_disk=2GB total_vcpus=12 used_vcpus=2 pci_stats=[] {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 23 03:56:19 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:56:19 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:56:19 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Compute_service record updated for user:user {{(pid=71428) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 23 03:56:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.215s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:56:20 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Acquiring lock "ee89a9d5-0507-475c-bea2-e02e34d15710" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:56:20 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Lock "ee89a9d5-0507-475c-bea2-e02e34d15710" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:56:20 user nova-compute[71428]: DEBUG nova.compute.manager [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 
tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Starting instance... {{(pid=71428) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 23 03:56:20 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:56:20 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:56:20 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:56:20 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71428) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 23 03:56:20 user nova-compute[71428]: INFO nova.compute.claims [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Claim successful on node user Apr 23 03:56:20 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:56:20 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:56:21 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:56:21 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 
'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:56:21 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.222s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:56:21 user nova-compute[71428]: DEBUG nova.compute.manager [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Start building networks asynchronously for instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 23 03:56:21 user nova-compute[71428]: DEBUG nova.compute.manager [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Allocating IP information in the background. {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 23 03:56:21 user nova-compute[71428]: DEBUG nova.network.neutron [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] allocate_for_instance() {{(pid=71428) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 23 03:56:21 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 23 03:56:21 user nova-compute[71428]: DEBUG nova.compute.manager [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Start building block device mappings for instance. 
{{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 23 03:56:21 user nova-compute[71428]: DEBUG nova.policy [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '45d9ae4a3b00492fa2fc16a22a34df09', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '39d6cb4cfcb2413f8039ee9febf1c046', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71428) authorize /opt/stack/nova/nova/policy.py:203}} Apr 23 03:56:21 user nova-compute[71428]: DEBUG nova.compute.manager [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Start spawning the instance on the hypervisor. {{(pid=71428) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 23 03:56:21 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Creating instance directory {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 23 03:56:21 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Creating image(s) Apr 23 03:56:21 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Acquiring lock "/opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:56:21 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Lock "/opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:56:21 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Lock "/opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:56:21 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None 
req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:56:21 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.135s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:56:21 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Acquiring lock "935dc3474dddbde110531a31510941caddc0ae83" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:56:21 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:56:21 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:56:21 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:56:21 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Starting heal instance info cache {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 23 03:56:21 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" 
returned: 0 in 0.149s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:56:21 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk 1073741824 {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:56:21 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk 1073741824" returned: 0 in 0.048s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:56:21 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.202s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:56:21 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:56:21 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.142s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:56:21 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Checking if we can resize image /opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk. 
size=1073741824 {{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 23 03:56:21 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:56:21 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "refresh_cache-97b28eb9-634c-4f9b-97ef-41d9f45f19b7" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:56:21 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquired lock "refresh_cache-97b28eb9-634c-4f9b-97ef-41d9f45f19b7" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:56:21 user nova-compute[71428]: DEBUG nova.network.neutron [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Forcefully refreshing network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 23 03:56:22 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk --force-share --output=json" returned: 0 in 0.148s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:56:22 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Cannot resize image /opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk to a smaller size. 
{{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 23 03:56:22 user nova-compute[71428]: DEBUG nova.objects.instance [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Lazy-loading 'migration_context' on Instance uuid ee89a9d5-0507-475c-bea2-e02e34d15710 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:56:22 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Created local disks {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 23 03:56:22 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Ensure instance console log exists: /opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/console.log {{(pid=71428) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 23 03:56:22 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:56:22 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:56:22 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:56:22 user nova-compute[71428]: DEBUG nova.network.neutron [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Successfully created port: 8948ed3a-de42-41ef-b7ca-c34a2d405b09 {{(pid=71428) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 23 03:56:22 user nova-compute[71428]: DEBUG nova.network.neutron [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Updating instance_info_cache with network_info: [{"id": "6d65b2b7-2a9a-4e2b-b003-af767e8a544f", "address": "fa:16:3e:a0:05:16", "network": {"id": "25de9e6c-4c02-4658-b076-466681d96605", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1644366708-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", 
"version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a79e09b1c4ae4cc5ab11c3e56ee4f0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d65b2b7-2a", "ovs_interfaceid": "6d65b2b7-2a9a-4e2b-b003-af767e8a544f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:56:22 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Releasing lock "refresh_cache-97b28eb9-634c-4f9b-97ef-41d9f45f19b7" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:56:22 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Updated the network info_cache for instance {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 23 03:56:22 user nova-compute[71428]: DEBUG nova.network.neutron [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Successfully updated port: 8948ed3a-de42-41ef-b7ca-c34a2d405b09 {{(pid=71428) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Acquiring lock "refresh_cache-ee89a9d5-0507-475c-bea2-e02e34d15710" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Acquired lock "refresh_cache-ee89a9d5-0507-475c-bea2-e02e34d15710" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG nova.network.neutron [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Building network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG nova.compute.manager [req-b40aa04c-05bd-43af-a288-2b7e57327764 req-3500009f-3368-4dd8-ad51-7b954eab6762 service nova] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Received event network-changed-8948ed3a-de42-41ef-b7ca-c34a2d405b09 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG nova.compute.manager [req-b40aa04c-05bd-43af-a288-2b7e57327764 req-3500009f-3368-4dd8-ad51-7b954eab6762 service nova] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Refreshing instance 
network info cache due to event network-changed-8948ed3a-de42-41ef-b7ca-c34a2d405b09. {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-b40aa04c-05bd-43af-a288-2b7e57327764 req-3500009f-3368-4dd8-ad51-7b954eab6762 service nova] Acquiring lock "refresh_cache-ee89a9d5-0507-475c-bea2-e02e34d15710" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG nova.network.neutron [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Instance cache missing network info. {{(pid=71428) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG nova.network.neutron [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Updating instance_info_cache with network_info: [{"id": "8948ed3a-de42-41ef-b7ca-c34a2d405b09", "address": "fa:16:3e:d2:42:a8", "network": {"id": "083ef998-5361-46b7-827f-d0f086e9e552", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1112168066-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "39d6cb4cfcb2413f8039ee9febf1c046", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8948ed3a-de", "ovs_interfaceid": "8948ed3a-de42-41ef-b7ca-c34a2d405b09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Releasing lock "refresh_cache-ee89a9d5-0507-475c-bea2-e02e34d15710" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG nova.compute.manager [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Instance network_info: |[{"id": "8948ed3a-de42-41ef-b7ca-c34a2d405b09", "address": "fa:16:3e:d2:42:a8", "network": {"id": "083ef998-5361-46b7-827f-d0f086e9e552", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1112168066-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "39d6cb4cfcb2413f8039ee9febf1c046", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8948ed3a-de", "ovs_interfaceid": "8948ed3a-de42-41ef-b7ca-c34a2d405b09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-b40aa04c-05bd-43af-a288-2b7e57327764 req-3500009f-3368-4dd8-ad51-7b954eab6762 service nova] Acquired lock "refresh_cache-ee89a9d5-0507-475c-bea2-e02e34d15710" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG nova.network.neutron [req-b40aa04c-05bd-43af-a288-2b7e57327764 req-3500009f-3368-4dd8-ad51-7b954eab6762 service nova] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Refreshing network info cache for port 8948ed3a-de42-41ef-b7ca-c34a2d405b09 {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Start _get_guest_xml network_info=[{"id": "8948ed3a-de42-41ef-b7ca-c34a2d405b09", "address": "fa:16:3e:d2:42:a8", "network": {"id": "083ef998-5361-46b7-827f-d0f086e9e552", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1112168066-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "39d6cb4cfcb2413f8039ee9febf1c046", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8948ed3a-de", "ovs_interfaceid": "8948ed3a-de42-41ef-b7ca-c34a2d405b09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'device_name': 
'/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'image_id': 'e6127373-9931-4277-9458-eceef653ea1e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 23 03:56:23 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:56:23 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:56:23 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71428) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-23T03:43:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=), allow threads: True {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Flavor limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Image limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Flavor pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 
tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Image pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Got 1 possible topologies {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:56:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1352466124',display_name='tempest-ServersNegativeTestJSON-server-1352466124',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-1352466124',id=19,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='39d6cb4cfcb2413f8039ee9febf1c046',ramdisk_id='',reservation_id='r-aw5j0sob',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1115615559',owner_user_name='tempest-ServersNegativeTestJSON-1115615559-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:56:21Z,user_data=None,user_id='45d9ae4a3b00492fa2fc16a22a34df09',uuid=ee89a9d5-0507-475c-bea2-e02e34d15710,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8948ed3a-de42-41ef-b7ca-c34a2d405b09", "address": "fa:16:3e:d2:42:a8", "network": {"id": "083ef998-5361-46b7-827f-d0f086e9e552", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1112168066-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "39d6cb4cfcb2413f8039ee9febf1c046", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8948ed3a-de", "ovs_interfaceid": "8948ed3a-de42-41ef-b7ca-c34a2d405b09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71428) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Converting VIF {"id": "8948ed3a-de42-41ef-b7ca-c34a2d405b09", "address": "fa:16:3e:d2:42:a8", "network": {"id": 
"083ef998-5361-46b7-827f-d0f086e9e552", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1112168066-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "39d6cb4cfcb2413f8039ee9febf1c046", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8948ed3a-de", "ovs_interfaceid": "8948ed3a-de42-41ef-b7ca-c34a2d405b09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d2:42:a8,bridge_name='br-int',has_traffic_filtering=True,id=8948ed3a-de42-41ef-b7ca-c34a2d405b09,network=Network(083ef998-5361-46b7-827f-d0f086e9e552),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8948ed3a-de') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG nova.objects.instance [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Lazy-loading 'pci_devices' on Instance uuid ee89a9d5-0507-475c-bea2-e02e34d15710 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] End _get_guest_xml xml= Apr 23 03:56:23 user nova-compute[71428]: ee89a9d5-0507-475c-bea2-e02e34d15710 Apr 23 03:56:23 user nova-compute[71428]: instance-00000013 Apr 23 03:56:23 user nova-compute[71428]: 131072 Apr 23 03:56:23 user nova-compute[71428]: 1 Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: tempest-ServersNegativeTestJSON-server-1352466124 Apr 23 03:56:23 user nova-compute[71428]: 2023-04-23 03:56:23 Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: 128 Apr 23 03:56:23 user nova-compute[71428]: 1 Apr 23 03:56:23 user nova-compute[71428]: 0 Apr 23 03:56:23 user nova-compute[71428]: 0 Apr 23 03:56:23 user nova-compute[71428]: 1 Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: tempest-ServersNegativeTestJSON-1115615559-project-member Apr 23 03:56:23 user nova-compute[71428]: tempest-ServersNegativeTestJSON-1115615559 Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user 
nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: OpenStack Foundation Apr 23 03:56:23 user nova-compute[71428]: OpenStack Nova Apr 23 03:56:23 user nova-compute[71428]: 0.0.0 Apr 23 03:56:23 user nova-compute[71428]: ee89a9d5-0507-475c-bea2-e02e34d15710 Apr 23 03:56:23 user nova-compute[71428]: ee89a9d5-0507-475c-bea2-e02e34d15710 Apr 23 03:56:23 user nova-compute[71428]: Virtual Machine Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: hvm Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Nehalem Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: /dev/urandom Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: Apr 23 03:56:23 user nova-compute[71428]: {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:56:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1352466124',display_name='tempest-ServersNegativeTestJSON-server-1352466124',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-1352466124',id=19,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='39d6cb4cfcb2413f8039ee9febf1c046',ramdisk_id='',reservation_id='r-aw5j0sob',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1115615559',owner_user_name='tempest-ServersNegativeTestJSON-1115615559-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:56:21Z,user_data=None,user_id='45d9ae4a3b00492fa2fc16a22a34df09',uuid=ee89a9d5-0507-475c-bea2-e02e34d15710,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8948ed3a-de42-41ef-b7ca-c34a2d405b09", "address": "fa:16:3e:d2:42:a8", "network": {"id": "083ef998-5361-46b7-827f-d0f086e9e552", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1112168066-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "39d6cb4cfcb2413f8039ee9febf1c046", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8948ed3a-de", "ovs_interfaceid": "8948ed3a-de42-41ef-b7ca-c34a2d405b09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Converting VIF {"id": "8948ed3a-de42-41ef-b7ca-c34a2d405b09", "address": "fa:16:3e:d2:42:a8", "network": {"id": 
"083ef998-5361-46b7-827f-d0f086e9e552", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1112168066-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "39d6cb4cfcb2413f8039ee9febf1c046", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8948ed3a-de", "ovs_interfaceid": "8948ed3a-de42-41ef-b7ca-c34a2d405b09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d2:42:a8,bridge_name='br-int',has_traffic_filtering=True,id=8948ed3a-de42-41ef-b7ca-c34a2d405b09,network=Network(083ef998-5361-46b7-827f-d0f086e9e552),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8948ed3a-de') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG os_vif [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d2:42:a8,bridge_name='br-int',has_traffic_filtering=True,id=8948ed3a-de42-41ef-b7ca-c34a2d405b09,network=Network(083ef998-5361-46b7-827f-d0f086e9e552),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8948ed3a-de') {{(pid=71428) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8948ed3a-de, may_exist=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): 
DbSetCommand(_result=None, table=Interface, record=tap8948ed3a-de, col_values=(('external_ids', {'iface-id': '8948ed3a-de42-41ef-b7ca-c34a2d405b09', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d2:42:a8', 'vm-uuid': 'ee89a9d5-0507-475c-bea2-e02e34d15710'}),)) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:56:23 user nova-compute[71428]: INFO os_vif [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d2:42:a8,bridge_name='br-int',has_traffic_filtering=True,id=8948ed3a-de42-41ef-b7ca-c34a2d405b09,network=Network(083ef998-5361-46b7-827f-d0f086e9e552),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8948ed3a-de') Apr 23 03:56:23 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] No BDM found with device name vda, not building metadata. {{(pid=71428) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] No VIF found with MAC fa:16:3e:d2:42:a8, not building metadata {{(pid=71428) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 23 03:56:23 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:56:24 user nova-compute[71428]: DEBUG nova.network.neutron [req-b40aa04c-05bd-43af-a288-2b7e57327764 req-3500009f-3368-4dd8-ad51-7b954eab6762 service nova] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Updated VIF entry in instance network info cache for port 8948ed3a-de42-41ef-b7ca-c34a2d405b09. 
{{(pid=71428) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 23 03:56:24 user nova-compute[71428]: DEBUG nova.network.neutron [req-b40aa04c-05bd-43af-a288-2b7e57327764 req-3500009f-3368-4dd8-ad51-7b954eab6762 service nova] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Updating instance_info_cache with network_info: [{"id": "8948ed3a-de42-41ef-b7ca-c34a2d405b09", "address": "fa:16:3e:d2:42:a8", "network": {"id": "083ef998-5361-46b7-827f-d0f086e9e552", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1112168066-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "39d6cb4cfcb2413f8039ee9febf1c046", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8948ed3a-de", "ovs_interfaceid": "8948ed3a-de42-41ef-b7ca-c34a2d405b09", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:56:24 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-b40aa04c-05bd-43af-a288-2b7e57327764 req-3500009f-3368-4dd8-ad51-7b954eab6762 service nova] Releasing lock "refresh_cache-ee89a9d5-0507-475c-bea2-e02e34d15710" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:56:24 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:56:24 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:56:24 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:56:25 user nova-compute[71428]: DEBUG nova.compute.manager [req-71dddd89-db86-475f-8046-200902a259ee req-1f673c24-6409-4701-9f0b-a140a9ca565d service nova] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Received event network-vif-plugged-8948ed3a-de42-41ef-b7ca-c34a2d405b09 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:56:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-71dddd89-db86-475f-8046-200902a259ee req-1f673c24-6409-4701-9f0b-a140a9ca565d service nova] Acquiring lock "ee89a9d5-0507-475c-bea2-e02e34d15710-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:56:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-71dddd89-db86-475f-8046-200902a259ee req-1f673c24-6409-4701-9f0b-a140a9ca565d service nova] Lock "ee89a9d5-0507-475c-bea2-e02e34d15710-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:56:25 user 
nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-71dddd89-db86-475f-8046-200902a259ee req-1f673c24-6409-4701-9f0b-a140a9ca565d service nova] Lock "ee89a9d5-0507-475c-bea2-e02e34d15710-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:56:25 user nova-compute[71428]: DEBUG nova.compute.manager [req-71dddd89-db86-475f-8046-200902a259ee req-1f673c24-6409-4701-9f0b-a140a9ca565d service nova] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] No waiting events found dispatching network-vif-plugged-8948ed3a-de42-41ef-b7ca-c34a2d405b09 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:56:25 user nova-compute[71428]: WARNING nova.compute.manager [req-71dddd89-db86-475f-8046-200902a259ee req-1f673c24-6409-4701-9f0b-a140a9ca565d service nova] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Received unexpected event network-vif-plugged-8948ed3a-de42-41ef-b7ca-c34a2d405b09 for instance with vm_state building and task_state spawning. Apr 23 03:56:25 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:56:25 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:56:25 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:56:26 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Resumed> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:56:26 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] VM Resumed (Lifecycle Event) Apr 23 03:56:26 user nova-compute[71428]: DEBUG nova.compute.manager [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Instance event wait completed in 0 seconds for {{(pid=71428) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 23 03:56:26 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Guest created on hypervisor {{(pid=71428) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 23 03:56:26 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Instance spawned successfully. 
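[Annotation, not part of the captured log] The AddBridgeCommand / AddPortCommand / DbSetCommand transactions logged during the VIF plug above show os-vif driving Open vSwitch through the ovsdbapp library. A minimal sketch of the equivalent ovsdbapp calls follows; the ovsdb-server socket path and timeout are assumptions, while the bridge, port name, MAC address and iface-id are taken from the log entries.

# Editor's sketch, not from the log: roughly the transaction that produces the
# AddBridgeCommand / AddPortCommand / DbSetCommand entries above. The socket
# path and timeout are assumptions; names, MAC and UUIDs come from the log.
from ovsdbapp.backend.ovs_idl import connection
from ovsdbapp.schema.open_vswitch import impl_idl

idl = connection.OvsdbIdl.from_server('unix:/var/run/openvswitch/db.sock',
                                      'Open_vSwitch')
api = impl_idl.OvsdbIdl(connection.Connection(idl, timeout=10))

with api.transaction(check_error=True) as txn:
    txn.add(api.add_br('br-int', may_exist=True, datapath_type='system'))
    txn.add(api.add_port('br-int', 'tap8948ed3a-de', may_exist=True))
    txn.add(api.db_set(
        'Interface', 'tap8948ed3a-de',
        ('external_ids', {'iface-id': '8948ed3a-de42-41ef-b7ca-c34a2d405b09',
                          'iface-status': 'active',
                          'attached-mac': 'fa:16:3e:d2:42:a8',
                          'vm-uuid': 'ee89a9d5-0507-475c-bea2-e02e34d15710'})))

Note that the log ran the bridge and port commands as separate single-command transactions; they are combined here only to keep the sketch short.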
Apr 23 03:56:26 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 23 03:56:26 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:56:26 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 03:56:26 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Found default for hw_cdrom_bus of ide {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:56:26 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Found default for hw_disk_bus of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:56:26 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Found default for hw_input_bus of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:56:26 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Found default for hw_pointer_model of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:56:26 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Found default for hw_video_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:56:26 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: 
ee89a9d5-0507-475c-bea2-e02e34d15710] Found default for hw_vif_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:56:26 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] During sync_power_state the instance has a pending task (spawning). Skip. Apr 23 03:56:26 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Started> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:56:26 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] VM Started (Lifecycle Event) Apr 23 03:56:26 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:56:26 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 03:56:26 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] During sync_power_state the instance has a pending task (spawning). Skip. Apr 23 03:56:26 user nova-compute[71428]: INFO nova.compute.manager [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Took 5.60 seconds to spawn the instance on the hypervisor. Apr 23 03:56:26 user nova-compute[71428]: DEBUG nova.compute.manager [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:56:27 user nova-compute[71428]: INFO nova.compute.manager [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Took 6.13 seconds to build instance. 
Apr 23 03:56:27 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-862e2bc2-53d4-4db7-bc09-c54da19c8013 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Lock "ee89a9d5-0507-475c-bea2-e02e34d15710" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.222s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:56:27 user nova-compute[71428]: DEBUG nova.compute.manager [req-2c1a2488-bb4d-4311-bf04-e88b85d9b5bb req-e6beaa1e-d692-4f30-9210-2eeb81c33f37 service nova] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Received event network-vif-plugged-8948ed3a-de42-41ef-b7ca-c34a2d405b09 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:56:27 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-2c1a2488-bb4d-4311-bf04-e88b85d9b5bb req-e6beaa1e-d692-4f30-9210-2eeb81c33f37 service nova] Acquiring lock "ee89a9d5-0507-475c-bea2-e02e34d15710-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:56:27 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-2c1a2488-bb4d-4311-bf04-e88b85d9b5bb req-e6beaa1e-d692-4f30-9210-2eeb81c33f37 service nova] Lock "ee89a9d5-0507-475c-bea2-e02e34d15710-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:56:27 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-2c1a2488-bb4d-4311-bf04-e88b85d9b5bb req-e6beaa1e-d692-4f30-9210-2eeb81c33f37 service nova] Lock "ee89a9d5-0507-475c-bea2-e02e34d15710-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:56:27 user nova-compute[71428]: DEBUG nova.compute.manager [req-2c1a2488-bb4d-4311-bf04-e88b85d9b5bb req-e6beaa1e-d692-4f30-9210-2eeb81c33f37 service nova] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] No waiting events found dispatching network-vif-plugged-8948ed3a-de42-41ef-b7ca-c34a2d405b09 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:56:27 user nova-compute[71428]: WARNING nova.compute.manager [req-2c1a2488-bb4d-4311-bf04-e88b85d9b5bb req-e6beaa1e-d692-4f30-9210-2eeb81c33f37 service nova] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Received unexpected event network-vif-plugged-8948ed3a-de42-41ef-b7ca-c34a2d405b09 for instance with vm_state active and task_state None. 
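[Annotation, not part of the captured log] The paired lock messages throughout this capture come from oslo_concurrency's lockutils: "Acquiring/Acquired/Releasing lock" (lockutils.py:312/315/333) is the context-manager form, while 'Lock "…" acquired by "…" :: waited/held' (lockutils.py:404/409/423) is the synchronized-decorator form. A minimal sketch of both patterns, reusing lock names visible in the log; the function body is a placeholder.

# Editor's sketch, not from the log: the two oslo_concurrency patterns behind
# the lock entries above.
from oslo_concurrency import lockutils

# Context-manager form -> "Acquiring lock ... / Acquired lock ... / Releasing lock ..."
with lockutils.lock('refresh_cache-ee89a9d5-0507-475c-bea2-e02e34d15710'):
    pass  # e.g. rebuild the instance network info cache while holding the lock

# Decorator form -> 'Lock "..." acquired by "..." :: waited ...s' / ':: held ...s'
@lockutils.synchronized('ee89a9d5-0507-475c-bea2-e02e34d15710-events')
def _pop_event():
    pass  # placeholder body; the real code pops a waiting instance event here

_pop_event()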
Apr 23 03:56:28 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:56:28 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:56:33 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:56:33 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:56:38 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:56:38 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:56:43 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:56:43 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:56:47 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f72aa7bc-9edd-4cc4-9f97-c0e637f39a9f tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Acquiring lock "97b28eb9-634c-4f9b-97ef-41d9f45f19b7" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:56:47 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f72aa7bc-9edd-4cc4-9f97-c0e637f39a9f tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "97b28eb9-634c-4f9b-97ef-41d9f45f19b7" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:56:47 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f72aa7bc-9edd-4cc4-9f97-c0e637f39a9f tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Acquiring lock "97b28eb9-634c-4f9b-97ef-41d9f45f19b7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:56:47 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f72aa7bc-9edd-4cc4-9f97-c0e637f39a9f tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "97b28eb9-634c-4f9b-97ef-41d9f45f19b7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:56:47 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None 
req-f72aa7bc-9edd-4cc4-9f97-c0e637f39a9f tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "97b28eb9-634c-4f9b-97ef-41d9f45f19b7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:56:47 user nova-compute[71428]: INFO nova.compute.manager [None req-f72aa7bc-9edd-4cc4-9f97-c0e637f39a9f tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Terminating instance Apr 23 03:56:47 user nova-compute[71428]: DEBUG nova.compute.manager [None req-f72aa7bc-9edd-4cc4-9f97-c0e637f39a9f tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Start destroying the instance on the hypervisor. {{(pid=71428) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 23 03:56:48 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:56:48 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:56:48 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:56:48 user nova-compute[71428]: DEBUG nova.compute.manager [req-90414942-330a-4887-83a7-371b3365b384 req-405f464b-1772-4dc0-9f5d-7e661b85f08a service nova] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Received event network-vif-unplugged-6d65b2b7-2a9a-4e2b-b003-af767e8a544f {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:56:48 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-90414942-330a-4887-83a7-371b3365b384 req-405f464b-1772-4dc0-9f5d-7e661b85f08a service nova] Acquiring lock "97b28eb9-634c-4f9b-97ef-41d9f45f19b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:56:48 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-90414942-330a-4887-83a7-371b3365b384 req-405f464b-1772-4dc0-9f5d-7e661b85f08a service nova] Lock "97b28eb9-634c-4f9b-97ef-41d9f45f19b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:56:48 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-90414942-330a-4887-83a7-371b3365b384 req-405f464b-1772-4dc0-9f5d-7e661b85f08a service nova] Lock "97b28eb9-634c-4f9b-97ef-41d9f45f19b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:56:48 user nova-compute[71428]: DEBUG nova.compute.manager [req-90414942-330a-4887-83a7-371b3365b384 req-405f464b-1772-4dc0-9f5d-7e661b85f08a service nova] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] No waiting 
events found dispatching network-vif-unplugged-6d65b2b7-2a9a-4e2b-b003-af767e8a544f {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:56:48 user nova-compute[71428]: DEBUG nova.compute.manager [req-90414942-330a-4887-83a7-371b3365b384 req-405f464b-1772-4dc0-9f5d-7e661b85f08a service nova] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Received event network-vif-unplugged-6d65b2b7-2a9a-4e2b-b003-af767e8a544f for instance with task_state deleting. {{(pid=71428) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 23 03:56:48 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:56:48 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:56:48 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Instance destroyed successfully. Apr 23 03:56:48 user nova-compute[71428]: DEBUG nova.objects.instance [None req-f72aa7bc-9edd-4cc4-9f97-c0e637f39a9f tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lazy-loading 'resources' on Instance uuid 97b28eb9-634c-4f9b-97ef-41d9f45f19b7 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:56:48 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-f72aa7bc-9edd-4cc4-9f97-c0e637f39a9f tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:52:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1016623574',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1016623574',id=17,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-23T03:53:05Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='a79e09b1c4ae4cc5ab11c3e56ee4f0d9',ramdisk_id='',reservation_id='r-y1kg0tql',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha
256='',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-653032906',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-23T03:54:55Z,user_data=None,user_id='99495726467944c38620831fc93e2856',uuid=97b28eb9-634c-4f9b-97ef-41d9f45f19b7,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6d65b2b7-2a9a-4e2b-b003-af767e8a544f", "address": "fa:16:3e:a0:05:16", "network": {"id": "25de9e6c-4c02-4658-b076-466681d96605", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1644366708-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a79e09b1c4ae4cc5ab11c3e56ee4f0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d65b2b7-2a", "ovs_interfaceid": "6d65b2b7-2a9a-4e2b-b003-af767e8a544f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 23 03:56:48 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-f72aa7bc-9edd-4cc4-9f97-c0e637f39a9f tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Converting VIF {"id": "6d65b2b7-2a9a-4e2b-b003-af767e8a544f", "address": "fa:16:3e:a0:05:16", "network": {"id": "25de9e6c-4c02-4658-b076-466681d96605", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1644366708-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a79e09b1c4ae4cc5ab11c3e56ee4f0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6d65b2b7-2a", "ovs_interfaceid": "6d65b2b7-2a9a-4e2b-b003-af767e8a544f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:56:48 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-f72aa7bc-9edd-4cc4-9f97-c0e637f39a9f tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a0:05:16,bridge_name='br-int',has_traffic_filtering=True,id=6d65b2b7-2a9a-4e2b-b003-af767e8a544f,network=Network(25de9e6c-4c02-4658-b076-466681d96605),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d65b2b7-2a') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:56:48 user nova-compute[71428]: DEBUG os_vif 
[None req-f72aa7bc-9edd-4cc4-9f97-c0e637f39a9f tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a0:05:16,bridge_name='br-int',has_traffic_filtering=True,id=6d65b2b7-2a9a-4e2b-b003-af767e8a544f,network=Network(25de9e6c-4c02-4658-b076-466681d96605),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d65b2b7-2a') {{(pid=71428) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 23 03:56:48 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:56:48 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6d65b2b7-2a, bridge=br-int, if_exists=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:56:48 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:56:48 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:56:48 user nova-compute[71428]: INFO os_vif [None req-f72aa7bc-9edd-4cc4-9f97-c0e637f39a9f tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a0:05:16,bridge_name='br-int',has_traffic_filtering=True,id=6d65b2b7-2a9a-4e2b-b003-af767e8a544f,network=Network(25de9e6c-4c02-4658-b076-466681d96605),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6d65b2b7-2a') Apr 23 03:56:48 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-f72aa7bc-9edd-4cc4-9f97-c0e637f39a9f tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Deleting instance files /opt/stack/data/nova/instances/97b28eb9-634c-4f9b-97ef-41d9f45f19b7_del Apr 23 03:56:48 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-f72aa7bc-9edd-4cc4-9f97-c0e637f39a9f tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Deletion of /opt/stack/data/nova/instances/97b28eb9-634c-4f9b-97ef-41d9f45f19b7_del complete Apr 23 03:56:48 user nova-compute[71428]: INFO nova.compute.manager [None req-f72aa7bc-9edd-4cc4-9f97-c0e637f39a9f tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Took 0.86 seconds to destroy the instance on the hypervisor. Apr 23 03:56:48 user nova-compute[71428]: DEBUG oslo.service.loopingcall [None req-f72aa7bc-9edd-4cc4-9f97-c0e637f39a9f tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71428) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 23 03:56:48 user nova-compute[71428]: DEBUG nova.compute.manager [-] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Deallocating network for instance {{(pid=71428) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 23 03:56:48 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] deallocate_for_instance() {{(pid=71428) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 23 03:56:48 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:56:49 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Updating instance_info_cache with network_info: [] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:56:49 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Took 0.49 seconds to deallocate network for instance. Apr 23 03:56:49 user nova-compute[71428]: DEBUG nova.compute.manager [req-79cac052-23db-4f23-9a35-a5ca1b1f7830 req-c117cb53-89f4-4be8-a2d8-f778888025d0 service nova] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Received event network-vif-deleted-6d65b2b7-2a9a-4e2b-b003-af767e8a544f {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:56:49 user nova-compute[71428]: INFO nova.compute.manager [req-79cac052-23db-4f23-9a35-a5ca1b1f7830 req-c117cb53-89f4-4be8-a2d8-f778888025d0 service nova] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Neutron deleted interface 6d65b2b7-2a9a-4e2b-b003-af767e8a544f; detaching it from the instance and deleting it from the info cache Apr 23 03:56:49 user nova-compute[71428]: DEBUG nova.network.neutron [req-79cac052-23db-4f23-9a35-a5ca1b1f7830 req-c117cb53-89f4-4be8-a2d8-f778888025d0 service nova] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Updating instance_info_cache with network_info: [] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:56:49 user nova-compute[71428]: DEBUG nova.compute.manager [req-79cac052-23db-4f23-9a35-a5ca1b1f7830 req-c117cb53-89f4-4be8-a2d8-f778888025d0 service nova] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Detach interface failed, port_id=6d65b2b7-2a9a-4e2b-b003-af767e8a544f, reason: Instance 97b28eb9-634c-4f9b-97ef-41d9f45f19b7 could not be found. 
{{(pid=71428) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 23 03:56:49 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f72aa7bc-9edd-4cc4-9f97-c0e637f39a9f tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:56:49 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f72aa7bc-9edd-4cc4-9f97-c0e637f39a9f tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:56:49 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-f72aa7bc-9edd-4cc4-9f97-c0e637f39a9f tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:56:49 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-f72aa7bc-9edd-4cc4-9f97-c0e637f39a9f tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:56:49 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f72aa7bc-9edd-4cc4-9f97-c0e637f39a9f tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.159s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:56:49 user nova-compute[71428]: INFO nova.scheduler.client.report [None req-f72aa7bc-9edd-4cc4-9f97-c0e637f39a9f tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Deleted allocations for instance 97b28eb9-634c-4f9b-97ef-41d9f45f19b7 Apr 23 03:56:49 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-f72aa7bc-9edd-4cc4-9f97-c0e637f39a9f tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "97b28eb9-634c-4f9b-97ef-41d9f45f19b7" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.685s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:56:50 user nova-compute[71428]: DEBUG 
nova.compute.manager [req-bcf7cae6-2252-414e-bbba-0434a4b7fc32 req-49f90395-4ff5-4ec1-98bf-ca7c3349ce0f service nova] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Received event network-vif-plugged-6d65b2b7-2a9a-4e2b-b003-af767e8a544f {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:56:50 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-bcf7cae6-2252-414e-bbba-0434a4b7fc32 req-49f90395-4ff5-4ec1-98bf-ca7c3349ce0f service nova] Acquiring lock "97b28eb9-634c-4f9b-97ef-41d9f45f19b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:56:50 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-bcf7cae6-2252-414e-bbba-0434a4b7fc32 req-49f90395-4ff5-4ec1-98bf-ca7c3349ce0f service nova] Lock "97b28eb9-634c-4f9b-97ef-41d9f45f19b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:56:50 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-bcf7cae6-2252-414e-bbba-0434a4b7fc32 req-49f90395-4ff5-4ec1-98bf-ca7c3349ce0f service nova] Lock "97b28eb9-634c-4f9b-97ef-41d9f45f19b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:56:50 user nova-compute[71428]: DEBUG nova.compute.manager [req-bcf7cae6-2252-414e-bbba-0434a4b7fc32 req-49f90395-4ff5-4ec1-98bf-ca7c3349ce0f service nova] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] No waiting events found dispatching network-vif-plugged-6d65b2b7-2a9a-4e2b-b003-af767e8a544f {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:56:50 user nova-compute[71428]: WARNING nova.compute.manager [req-bcf7cae6-2252-414e-bbba-0434a4b7fc32 req-49f90395-4ff5-4ec1-98bf-ca7c3349ce0f service nova] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Received unexpected event network-vif-plugged-6d65b2b7-2a9a-4e2b-b003-af767e8a544f for instance with vm_state deleted and task_state None. 
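Annotation. The unplug sequence recorded above ("Converting VIF ..." -> "Converted object VIFOpenVSwitch(...)" -> "Unplugging vif ..." -> DelPortCommand -> "Successfully unplugged vif") is nova-compute handing the port over to the os-vif library. Below is a minimal sketch of that hand-off, not Nova's exact call path; the function name and its two arguments (a nova Instance object and one entry of its network info cache) are illustrative.

    import os_vif
    from nova.network import os_vif_util


    def unplug_instance_vif(instance, vif):
        """instance: nova.objects.Instance; vif: one network_info entry (dict)."""
        os_vif.initialize()  # loads the linux_bridge/noop/ovs plugins

        # Translate Nova's network_info entry into os-vif versioned objects,
        # the same conversion logged by nova_to_osvif_vif() above.
        instance_info = os_vif_util.nova_to_osvif_instance(instance)
        vif_obj = os_vif_util.nova_to_osvif_vif(vif)

        # The 'ovs' plugin then removes the tap device (tap6d65b2b7-2a) from
        # br-int; the DelPortCommand(..., if_exists=True) transaction in the
        # log is that plugin talking to ovsdb-server through ovsdbapp.
        os_vif.unplug(vif_obj, instance_info)

The "[POLLIN] on fd ..." and "0-ms timeout" lines interleaved with the unplug are the ovsdbapp IDL poll loop servicing that same transaction.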
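Annotation. The "Inventory has not changed for provider 3017e09c-... based on inventory data: {...}" entry above is Nova's report client comparing its local resource view against what Placement already holds. When it does change, the client PUTs a payload of the following shape to /resource_providers/<uuid>/inventories; the resource values below are copied from the log, while the generation number is a placeholder.

    import json

    # Sketch of the Placement inventories payload; generation is illustrative.
    payload = {
        "resource_provider_generation": 0,
        "inventories": {
            "VCPU":      {"total": 12,    "reserved": 0,   "min_unit": 1,
                          "max_unit": 12,    "step_size": 1, "allocation_ratio": 4.0},
            "MEMORY_MB": {"total": 16023, "reserved": 512, "min_unit": 1,
                          "max_unit": 16023, "step_size": 1, "allocation_ratio": 1.0},
            "DISK_GB":   {"total": 40,    "reserved": 0,   "min_unit": 1,
                          "max_unit": 40,    "step_size": 1, "allocation_ratio": 1.0},
        },
    }
    print(json.dumps(payload, indent=2))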
Apr 23 03:56:53 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:56:53 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:56:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:57:03 user nova-compute[71428]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:57:03 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] VM Stopped (Lifecycle Event) Apr 23 03:57:03 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7980f391-fca3-408b-9c3a-a92429aee275 None None] [instance: 97b28eb9-634c-4f9b-97ef-41d9f45f19b7] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:57:03 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:57:08 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:57:11 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:57:13 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:57:13 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:57:14 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:57:14 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71428) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 23 03:57:15 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:57:15 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:57:15 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Cleaning up deleted instances with incomplete migration {{(pid=71428) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 23 03:57:15 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:57:17 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:57:18 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:57:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:57:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:57:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:57:18 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Auditing locally available compute resources for user (node: user) {{(pid=71428) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 23 03:57:18 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:57:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None 
req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:57:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:57:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:57:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:57:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:57:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:57:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:57:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:57:19 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:57:19 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:57:19 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Hypervisor/Node resource view: name=user free_ram=8942MB free_disk=26.227954864501953GB free_vcpus=10 pci_devices=[{"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": 
"type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", 
"vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71428) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 23 03:57:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:57:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:57:19 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance c8480a7c-b5de-4f66-a4f0-08fc679b0dfd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:57:19 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance ee89a9d5-0507-475c-bea2-e02e34d15710 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:57:19 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Total usable vcpus: 12, total allocated vcpus: 2 {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 23 03:57:19 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Final resource view: name=user phys_ram=16023MB used_ram=768MB phys_disk=40GB used_disk=2GB total_vcpus=12 used_vcpus=2 pci_stats=[] {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 23 03:57:20 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:57:20 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:57:20 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Compute_service record updated for user:user {{(pid=71428) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 23 03:57:20 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.308s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:57:22 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:57:22 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:57:22 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:57:23 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71428) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:57:23 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Starting heal instance info cache {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 23 03:57:23 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Rebuilding the list of instances to heal {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 23 03:57:23 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:57:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "refresh_cache-c8480a7c-b5de-4f66-a4f0-08fc679b0dfd" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:57:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquired lock "refresh_cache-c8480a7c-b5de-4f66-a4f0-08fc679b0dfd" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:57:23 user nova-compute[71428]: DEBUG nova.network.neutron [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Forcefully refreshing network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 23 03:57:23 user nova-compute[71428]: DEBUG nova.objects.instance [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lazy-loading 'info_cache' on Instance uuid c8480a7c-b5de-4f66-a4f0-08fc679b0dfd {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:57:24 user nova-compute[71428]: DEBUG nova.network.neutron [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Updating instance_info_cache with network_info: [{"id": "e7548d21-c865-4648-a8a6-dba748a01e14", "address": "fa:16:3e:08:69:16", "network": {"id": "25de9e6c-4c02-4658-b076-466681d96605", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1644366708-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a79e09b1c4ae4cc5ab11c3e56ee4f0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape7548d21-c8", "ovs_interfaceid": "e7548d21-c865-4648-a8a6-dba748a01e14", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:57:24 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Releasing lock "refresh_cache-c8480a7c-b5de-4f66-a4f0-08fc679b0dfd" {{(pid=71428) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:57:24 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Updated the network info_cache for instance {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 23 03:57:28 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:57:28 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Cleaning up deleted instances {{(pid=71428) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 23 03:57:28 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] There are 0 instances to clean {{(pid=71428) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 23 03:57:28 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:57:33 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:57:38 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-24658ab6-ae82-4e9b-a232-cca990f99445 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Acquiring lock "c8480a7c-b5de-4f66-a4f0-08fc679b0dfd" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:57:38 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-24658ab6-ae82-4e9b-a232-cca990f99445 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "c8480a7c-b5de-4f66-a4f0-08fc679b0dfd" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:57:38 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-24658ab6-ae82-4e9b-a232-cca990f99445 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Acquiring lock "c8480a7c-b5de-4f66-a4f0-08fc679b0dfd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:57:38 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-24658ab6-ae82-4e9b-a232-cca990f99445 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "c8480a7c-b5de-4f66-a4f0-08fc679b0dfd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:57:38 user nova-compute[71428]: DEBUG 
oslo_concurrency.lockutils [None req-24658ab6-ae82-4e9b-a232-cca990f99445 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "c8480a7c-b5de-4f66-a4f0-08fc679b0dfd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:57:38 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:57:38 user nova-compute[71428]: INFO nova.compute.manager [None req-24658ab6-ae82-4e9b-a232-cca990f99445 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Terminating instance Apr 23 03:57:38 user nova-compute[71428]: DEBUG nova.compute.manager [None req-24658ab6-ae82-4e9b-a232-cca990f99445 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Start destroying the instance on the hypervisor. {{(pid=71428) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 23 03:57:38 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:57:38 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:57:38 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:57:39 user nova-compute[71428]: DEBUG nova.compute.manager [req-196ab89b-66af-4b75-9818-25e02b57f19b req-f042bc7b-5255-4b45-aa71-6f86392b6979 service nova] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Received event network-vif-unplugged-e7548d21-c865-4648-a8a6-dba748a01e14 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:57:39 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-196ab89b-66af-4b75-9818-25e02b57f19b req-f042bc7b-5255-4b45-aa71-6f86392b6979 service nova] Acquiring lock "c8480a7c-b5de-4f66-a4f0-08fc679b0dfd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:57:39 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-196ab89b-66af-4b75-9818-25e02b57f19b req-f042bc7b-5255-4b45-aa71-6f86392b6979 service nova] Lock "c8480a7c-b5de-4f66-a4f0-08fc679b0dfd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:57:39 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-196ab89b-66af-4b75-9818-25e02b57f19b req-f042bc7b-5255-4b45-aa71-6f86392b6979 service nova] Lock "c8480a7c-b5de-4f66-a4f0-08fc679b0dfd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 
03:57:39 user nova-compute[71428]: DEBUG nova.compute.manager [req-196ab89b-66af-4b75-9818-25e02b57f19b req-f042bc7b-5255-4b45-aa71-6f86392b6979 service nova] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] No waiting events found dispatching network-vif-unplugged-e7548d21-c865-4648-a8a6-dba748a01e14 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:57:39 user nova-compute[71428]: DEBUG nova.compute.manager [req-196ab89b-66af-4b75-9818-25e02b57f19b req-f042bc7b-5255-4b45-aa71-6f86392b6979 service nova] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Received event network-vif-unplugged-e7548d21-c865-4648-a8a6-dba748a01e14 for instance with task_state deleting. {{(pid=71428) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 23 03:57:39 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:57:39 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Instance destroyed successfully. Apr 23 03:57:39 user nova-compute[71428]: DEBUG nova.objects.instance [None req-24658ab6-ae82-4e9b-a232-cca990f99445 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lazy-loading 'resources' on Instance uuid c8480a7c-b5de-4f66-a4f0-08fc679b0dfd {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:57:39 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-24658ab6-ae82-4e9b-a232-cca990f99445 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:49:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-2136327616',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-2136327616',id=9,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-23T03:49:15Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='a79e09b1c4ae4cc5ab11c3e56ee4f0d9',ramdisk_id='',reservation_id='r-uc8k6aib',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-653032906',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-23T03:51:04Z,user_data=None,user_id='99495726467944c38620831fc93e2856',uuid=c8480a7c-b5de-4f66-a4f0-08fc679b0dfd,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e7548d21-c865-4648-a8a6-dba748a01e14", "address": "fa:16:3e:08:69:16", "network": {"id": "25de9e6c-4c02-4658-b076-466681d96605", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1644366708-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a79e09b1c4ae4cc5ab11c3e56ee4f0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape7548d21-c8", "ovs_interfaceid": "e7548d21-c865-4648-a8a6-dba748a01e14", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 23 03:57:39 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-24658ab6-ae82-4e9b-a232-cca990f99445 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Converting VIF 
{"id": "e7548d21-c865-4648-a8a6-dba748a01e14", "address": "fa:16:3e:08:69:16", "network": {"id": "25de9e6c-4c02-4658-b076-466681d96605", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1644366708-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a79e09b1c4ae4cc5ab11c3e56ee4f0d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape7548d21-c8", "ovs_interfaceid": "e7548d21-c865-4648-a8a6-dba748a01e14", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:57:39 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-24658ab6-ae82-4e9b-a232-cca990f99445 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:08:69:16,bridge_name='br-int',has_traffic_filtering=True,id=e7548d21-c865-4648-a8a6-dba748a01e14,network=Network(25de9e6c-4c02-4658-b076-466681d96605),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7548d21-c8') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:57:39 user nova-compute[71428]: DEBUG os_vif [None req-24658ab6-ae82-4e9b-a232-cca990f99445 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:08:69:16,bridge_name='br-int',has_traffic_filtering=True,id=e7548d21-c865-4648-a8a6-dba748a01e14,network=Network(25de9e6c-4c02-4658-b076-466681d96605),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7548d21-c8') {{(pid=71428) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 23 03:57:39 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:57:39 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape7548d21-c8, bridge=br-int, if_exists=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:57:39 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:57:39 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:57:39 user nova-compute[71428]: INFO os_vif [None req-24658ab6-ae82-4e9b-a232-cca990f99445 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Successfully unplugged vif 
VIFOpenVSwitch(active=True,address=fa:16:3e:08:69:16,bridge_name='br-int',has_traffic_filtering=True,id=e7548d21-c865-4648-a8a6-dba748a01e14,network=Network(25de9e6c-4c02-4658-b076-466681d96605),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7548d21-c8') Apr 23 03:57:39 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-24658ab6-ae82-4e9b-a232-cca990f99445 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Deleting instance files /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd_del Apr 23 03:57:39 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-24658ab6-ae82-4e9b-a232-cca990f99445 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Deletion of /opt/stack/data/nova/instances/c8480a7c-b5de-4f66-a4f0-08fc679b0dfd_del complete Apr 23 03:57:39 user nova-compute[71428]: INFO nova.compute.manager [None req-24658ab6-ae82-4e9b-a232-cca990f99445 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Took 0.84 seconds to destroy the instance on the hypervisor. Apr 23 03:57:39 user nova-compute[71428]: DEBUG oslo.service.loopingcall [None req-24658ab6-ae82-4e9b-a232-cca990f99445 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71428) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 23 03:57:39 user nova-compute[71428]: DEBUG nova.compute.manager [-] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Deallocating network for instance {{(pid=71428) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 23 03:57:39 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] deallocate_for_instance() {{(pid=71428) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 23 03:57:40 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Updating instance_info_cache with network_info: [] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:57:40 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Took 0.56 seconds to deallocate network for instance. 
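Annotation. The ComputeManager.update_available_resource entries earlier in this section, and the qemu-img probes they trigger, come from oslo.service's periodic-task machinery plus oslo.concurrency's prlimit wrapper: the "--as=1073741824 --cpu=30" re-exec through "python3.10 -m oslo_concurrency.prlimit" shown in the log is what processutils.execute() does when given a ProcessLimits object. A minimal sketch of both pieces follows; the class name, spacing value, and hard-coded disk path (taken from the log) are illustrative, not Nova's exact implementation.

    from oslo_concurrency import processutils
    from oslo_config import cfg
    from oslo_service import periodic_task

    CONF = cfg.CONF


    class AuditManager(periodic_task.PeriodicTasks):
        """Illustrative stand-in for the periodic-task side of ComputeManager."""

        def __init__(self):
            super().__init__(CONF)

        @periodic_task.periodic_task(spacing=60)
        def update_available_resource(self, context):
            # Mirror the prlimit-wrapped probe in the log: cap qemu-img at
            # 1 GiB of address space and 30 s of CPU time while reading the
            # image header with --force-share.
            limits = processutils.ProcessLimits(address_space=1024 ** 3,
                                                cpu_time=30)
            out, _err = processutils.execute(
                'env', 'LC_ALL=C', 'LANG=C', 'qemu-img', 'info',
                '/opt/stack/data/nova/instances/'
                'ee89a9d5-0507-475c-bea2-e02e34d15710/disk',
                '--force-share', '--output=json',
                prlimit=limits)
            return out

In the real service these tasks are driven by the service loop via run_periodic_tasks(context), which is what produces the "Running periodic task ComputeManager...." lines above.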
Apr 23 03:57:40 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-24658ab6-ae82-4e9b-a232-cca990f99445 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:57:40 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-24658ab6-ae82-4e9b-a232-cca990f99445 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:57:40 user nova-compute[71428]: DEBUG nova.compute.manager [req-6ffe909a-0a16-4b63-b2ea-3db131d95fbc req-01af52cf-1b84-4ee9-b5c3-8e39ef16c569 service nova] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Received event network-vif-deleted-e7548d21-c865-4648-a8a6-dba748a01e14 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:57:40 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-24658ab6-ae82-4e9b-a232-cca990f99445 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:57:40 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-24658ab6-ae82-4e9b-a232-cca990f99445 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:57:40 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-24658ab6-ae82-4e9b-a232-cca990f99445 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.148s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:57:40 user nova-compute[71428]: INFO nova.scheduler.client.report [None req-24658ab6-ae82-4e9b-a232-cca990f99445 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Deleted allocations for instance c8480a7c-b5de-4f66-a4f0-08fc679b0dfd Apr 23 03:57:40 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-24658ab6-ae82-4e9b-a232-cca990f99445 tempest-ServerBootFromVolumeStableRescueTest-653032906 tempest-ServerBootFromVolumeStableRescueTest-653032906-project-member] Lock 
"c8480a7c-b5de-4f66-a4f0-08fc679b0dfd" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.741s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:57:41 user nova-compute[71428]: DEBUG nova.compute.manager [req-f5097bbb-dbdd-4610-8089-a1858566e67a req-b2aae0e8-2b05-42e6-8072-2c901031f206 service nova] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Received event network-vif-plugged-e7548d21-c865-4648-a8a6-dba748a01e14 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:57:41 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-f5097bbb-dbdd-4610-8089-a1858566e67a req-b2aae0e8-2b05-42e6-8072-2c901031f206 service nova] Acquiring lock "c8480a7c-b5de-4f66-a4f0-08fc679b0dfd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:57:41 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-f5097bbb-dbdd-4610-8089-a1858566e67a req-b2aae0e8-2b05-42e6-8072-2c901031f206 service nova] Lock "c8480a7c-b5de-4f66-a4f0-08fc679b0dfd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:57:41 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-f5097bbb-dbdd-4610-8089-a1858566e67a req-b2aae0e8-2b05-42e6-8072-2c901031f206 service nova] Lock "c8480a7c-b5de-4f66-a4f0-08fc679b0dfd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:57:41 user nova-compute[71428]: DEBUG nova.compute.manager [req-f5097bbb-dbdd-4610-8089-a1858566e67a req-b2aae0e8-2b05-42e6-8072-2c901031f206 service nova] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] No waiting events found dispatching network-vif-plugged-e7548d21-c865-4648-a8a6-dba748a01e14 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:57:41 user nova-compute[71428]: WARNING nova.compute.manager [req-f5097bbb-dbdd-4610-8089-a1858566e67a req-b2aae0e8-2b05-42e6-8072-2c901031f206 service nova] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Received unexpected event network-vif-plugged-e7548d21-c865-4648-a8a6-dba748a01e14 for instance with vm_state deleted and task_state None. 
Apr 23 03:57:44 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:57:48 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:57:49 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:57:54 user nova-compute[71428]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:57:54 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] VM Stopped (Lifecycle Event) Apr 23 03:57:54 user nova-compute[71428]: DEBUG nova.compute.manager [None req-124bc1e4-a1e5-4962-9005-c9ec52f67922 None None] [instance: c8480a7c-b5de-4f66-a4f0-08fc679b0dfd] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:57:54 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:57:59 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:04 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:58:04 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:04 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71428) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 23 03:58:04 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 03:58:04 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 03:58:04 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:09 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:12 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Acquiring lock "994da665-4a5f-41ce-a104-febce8be2557" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:58:12 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 
tempest-ServersNegativeTestJSON-1115615559-project-member] Lock "994da665-4a5f-41ce-a104-febce8be2557" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:58:12 user nova-compute[71428]: DEBUG nova.compute.manager [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Starting instance... {{(pid=71428) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 23 03:58:12 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:58:12 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:58:12 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71428) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 23 03:58:12 user nova-compute[71428]: INFO nova.compute.claims [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Claim successful on node user Apr 23 03:58:12 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Refreshing inventories for resource provider 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 23 03:58:12 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Updating ProviderTree inventory for provider 3017e09c-9289-4a8e-8061-3ff90149e985 from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 23 03:58:12 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Updating inventory in ProviderTree for provider 3017e09c-9289-4a8e-8061-3ff90149e985 with inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 23 03:58:12 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Refreshing aggregate associations for resource provider 3017e09c-9289-4a8e-8061-3ff90149e985, aggregates: None {{(pid=71428) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 23 03:58:12 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Refreshing trait associations for resource provider 3017e09c-9289-4a8e-8061-3ff90149e985, traits: 
COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE41,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_RESCUE_BFV,COMPUTE_NODE,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_GRAPHICS_MODEL_CIRRUS {{(pid=71428) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 23 03:58:12 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:58:12 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:58:12 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:58:12 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.353s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:58:12 user nova-compute[71428]: DEBUG nova.compute.manager [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Start building networks asynchronously for instance. 
{{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 23 03:58:12 user nova-compute[71428]: DEBUG nova.compute.manager [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Allocating IP information in the background. {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 23 03:58:12 user nova-compute[71428]: DEBUG nova.network.neutron [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: 994da665-4a5f-41ce-a104-febce8be2557] allocate_for_instance() {{(pid=71428) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 23 03:58:12 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 23 03:58:12 user nova-compute[71428]: DEBUG nova.compute.manager [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Start building block device mappings for instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 23 03:58:12 user nova-compute[71428]: DEBUG nova.policy [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '45d9ae4a3b00492fa2fc16a22a34df09', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '39d6cb4cfcb2413f8039ee9febf1c046', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71428) authorize /opt/stack/nova/nova/policy.py:203}} Apr 23 03:58:12 user nova-compute[71428]: DEBUG nova.compute.manager [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Start spawning the instance on the hypervisor. 
{{(pid=71428) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 23 03:58:12 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Creating instance directory {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 23 03:58:12 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Creating image(s) Apr 23 03:58:13 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Acquiring lock "/opt/stack/data/nova/instances/994da665-4a5f-41ce-a104-febce8be2557/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:58:13 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Lock "/opt/stack/data/nova/instances/994da665-4a5f-41ce-a104-febce8be2557/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:58:13 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Lock "/opt/stack/data/nova/instances/994da665-4a5f-41ce-a104-febce8be2557/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:58:13 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:58:13 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.146s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:58:13 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None 
req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Acquiring lock "935dc3474dddbde110531a31510941caddc0ae83" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:58:13 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:58:13 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:58:13 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.139s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:58:13 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/994da665-4a5f-41ce-a104-febce8be2557/disk 1073741824 {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:58:13 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/994da665-4a5f-41ce-a104-febce8be2557/disk 1073741824" returned: 0 in 0.050s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:58:13 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.197s 
{{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:58:13 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:58:13 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.177s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:58:13 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Checking if we can resize image /opt/stack/data/nova/instances/994da665-4a5f-41ce-a104-febce8be2557/disk. size=1073741824 {{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 23 03:58:13 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/994da665-4a5f-41ce-a104-febce8be2557/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:58:13 user nova-compute[71428]: DEBUG nova.network.neutron [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Successfully created port: 7f5011fe-2339-41ec-b6db-535375a33a9d {{(pid=71428) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 23 03:58:13 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/994da665-4a5f-41ce-a104-febce8be2557/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:58:13 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Cannot resize image /opt/stack/data/nova/instances/994da665-4a5f-41ce-a104-febce8be2557/disk to a smaller size. 
{{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 23 03:58:13 user nova-compute[71428]: DEBUG nova.objects.instance [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Lazy-loading 'migration_context' on Instance uuid 994da665-4a5f-41ce-a104-febce8be2557 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:58:13 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Created local disks {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 23 03:58:13 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Ensure instance console log exists: /opt/stack/data/nova/instances/994da665-4a5f-41ce-a104-febce8be2557/console.log {{(pid=71428) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 23 03:58:13 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:58:13 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:58:13 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG nova.network.neutron [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Successfully updated port: 7f5011fe-2339-41ec-b6db-535375a33a9d {{(pid=71428) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Acquiring lock "refresh_cache-994da665-4a5f-41ce-a104-febce8be2557" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 
tempest-ServersNegativeTestJSON-1115615559-project-member] Acquired lock "refresh_cache-994da665-4a5f-41ce-a104-febce8be2557" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG nova.network.neutron [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Building network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG nova.compute.manager [req-e4bca89c-723f-4953-94a2-74aa4aa3fe7b req-eeb16672-a5c2-44a8-9d00-61498b377236 service nova] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Received event network-changed-7f5011fe-2339-41ec-b6db-535375a33a9d {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG nova.compute.manager [req-e4bca89c-723f-4953-94a2-74aa4aa3fe7b req-eeb16672-a5c2-44a8-9d00-61498b377236 service nova] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Refreshing instance network info cache due to event network-changed-7f5011fe-2339-41ec-b6db-535375a33a9d. {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-e4bca89c-723f-4953-94a2-74aa4aa3fe7b req-eeb16672-a5c2-44a8-9d00-61498b377236 service nova] Acquiring lock "refresh_cache-994da665-4a5f-41ce-a104-febce8be2557" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG nova.network.neutron [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Instance cache missing network info. 
{{(pid=71428) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG nova.network.neutron [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Updating instance_info_cache with network_info: [{"id": "7f5011fe-2339-41ec-b6db-535375a33a9d", "address": "fa:16:3e:8e:ca:f0", "network": {"id": "083ef998-5361-46b7-827f-d0f086e9e552", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1112168066-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "39d6cb4cfcb2413f8039ee9febf1c046", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f5011fe-23", "ovs_interfaceid": "7f5011fe-2339-41ec-b6db-535375a33a9d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71428) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Releasing lock "refresh_cache-994da665-4a5f-41ce-a104-febce8be2557" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG nova.compute.manager [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Instance network_info: |[{"id": "7f5011fe-2339-41ec-b6db-535375a33a9d", "address": "fa:16:3e:8e:ca:f0", "network": {"id": "083ef998-5361-46b7-827f-d0f086e9e552", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1112168066-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "39d6cb4cfcb2413f8039ee9febf1c046", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f5011fe-23", "ovs_interfaceid": "7f5011fe-2339-41ec-b6db-535375a33a9d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-e4bca89c-723f-4953-94a2-74aa4aa3fe7b req-eeb16672-a5c2-44a8-9d00-61498b377236 service nova] Acquired lock "refresh_cache-994da665-4a5f-41ce-a104-febce8be2557" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG nova.network.neutron [req-e4bca89c-723f-4953-94a2-74aa4aa3fe7b req-eeb16672-a5c2-44a8-9d00-61498b377236 service nova] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Refreshing network info cache for port 7f5011fe-2339-41ec-b6db-535375a33a9d {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Start _get_guest_xml network_info=[{"id": "7f5011fe-2339-41ec-b6db-535375a33a9d", "address": "fa:16:3e:8e:ca:f0", "network": {"id": "083ef998-5361-46b7-827f-d0f086e9e552", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1112168066-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "39d6cb4cfcb2413f8039ee9febf1c046", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": 
true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f5011fe-23", "ovs_interfaceid": "7f5011fe-2339-41ec-b6db-535375a33a9d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'image_id': 'e6127373-9931-4277-9458-eceef653ea1e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 23 03:58:14 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:58:14 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 23 03:58:14 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71428) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-23T03:43:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=), allow threads: True {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Flavor limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Image limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Flavor pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Image pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71428) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:564}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Got 1 possible topologies {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:58:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1132169926',display_name='tempest-ServersNegativeTestJSON-server-1132169926',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-1132169926',id=20,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='39d6cb4cfcb2413f8039ee9febf1c046',ramdisk_id='',reservation_id='r-n7idxnsu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1115615559',owner_user_name='tempest-ServersNegativeTestJSON-1115615559-project-member'},tags=TagLi
st,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:58:13Z,user_data=None,user_id='45d9ae4a3b00492fa2fc16a22a34df09',uuid=994da665-4a5f-41ce-a104-febce8be2557,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7f5011fe-2339-41ec-b6db-535375a33a9d", "address": "fa:16:3e:8e:ca:f0", "network": {"id": "083ef998-5361-46b7-827f-d0f086e9e552", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1112168066-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "39d6cb4cfcb2413f8039ee9febf1c046", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f5011fe-23", "ovs_interfaceid": "7f5011fe-2339-41ec-b6db-535375a33a9d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71428) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Converting VIF {"id": "7f5011fe-2339-41ec-b6db-535375a33a9d", "address": "fa:16:3e:8e:ca:f0", "network": {"id": "083ef998-5361-46b7-827f-d0f086e9e552", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1112168066-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "39d6cb4cfcb2413f8039ee9febf1c046", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f5011fe-23", "ovs_interfaceid": "7f5011fe-2339-41ec-b6db-535375a33a9d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8e:ca:f0,bridge_name='br-int',has_traffic_filtering=True,id=7f5011fe-2339-41ec-b6db-535375a33a9d,network=Network(083ef998-5361-46b7-827f-d0f086e9e552),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f5011fe-23') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG nova.objects.instance [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Lazy-loading 'pci_devices' on Instance uuid 
994da665-4a5f-41ce-a104-febce8be2557 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: 994da665-4a5f-41ce-a104-febce8be2557] End _get_guest_xml xml= [libvirt guest XML omitted: angle-bracket markup stripped during log extraction; recoverable fields: name instance-00000014, uuid 994da665-4a5f-41ce-a104-febce8be2557, memory 131072 KiB, 1 vCPU, os type hvm, CPU model Nehalem, RNG backend /dev/urandom, sysinfo OpenStack Foundation / OpenStack Nova 0.0.0, nova metadata name tempest-ServersNegativeTestJSON-server-1132169926] {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:58:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1132169926',display_name='tempest-ServersNegativeTestJSON-server-1132169926',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-1132169926',id=20,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='39d6cb4cfcb2413f8039ee9febf1c046',ramdisk_id='',reservation_id='r-n7idxnsu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1115615559',owner_user_name='tempest-ServersNegativeTestJSON-1115615559-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:58:13Z,user_data=None,user_id='45d9ae4a3b00492fa2fc16a22a34df09',uuid=994da665-4a5f-41ce-a104-febce8be2557,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7f5011fe-2339-41ec-b6db-535375a33a9d", "address": "fa:16:3e:8e:ca:f0", "network": {"id": "083ef998-5361-46b7-827f-d0f086e9e552", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1112168066-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type":
"gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "39d6cb4cfcb2413f8039ee9febf1c046", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f5011fe-23", "ovs_interfaceid": "7f5011fe-2339-41ec-b6db-535375a33a9d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Converting VIF {"id": "7f5011fe-2339-41ec-b6db-535375a33a9d", "address": "fa:16:3e:8e:ca:f0", "network": {"id": "083ef998-5361-46b7-827f-d0f086e9e552", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1112168066-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "39d6cb4cfcb2413f8039ee9febf1c046", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f5011fe-23", "ovs_interfaceid": "7f5011fe-2339-41ec-b6db-535375a33a9d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8e:ca:f0,bridge_name='br-int',has_traffic_filtering=True,id=7f5011fe-2339-41ec-b6db-535375a33a9d,network=Network(083ef998-5361-46b7-827f-d0f086e9e552),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f5011fe-23') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG os_vif [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8e:ca:f0,bridge_name='br-int',has_traffic_filtering=True,id=7f5011fe-2339-41ec-b6db-535375a33a9d,network=Network(083ef998-5361-46b7-827f-d0f086e9e552),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f5011fe-23') {{(pid=71428) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG 
ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7f5011fe-23, may_exist=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7f5011fe-23, col_values=(('external_ids', {'iface-id': '7f5011fe-2339-41ec-b6db-535375a33a9d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8e:ca:f0', 'vm-uuid': '994da665-4a5f-41ce-a104-febce8be2557'}),)) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:14 user nova-compute[71428]: INFO os_vif [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8e:ca:f0,bridge_name='br-int',has_traffic_filtering=True,id=7f5011fe-2339-41ec-b6db-535375a33a9d,network=Network(083ef998-5361-46b7-827f-d0f086e9e552),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f5011fe-23') Apr 23 03:58:14 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=71428) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 23 03:58:14 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] No VIF found with MAC fa:16:3e:8e:ca:f0, not building metadata {{(pid=71428) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 23 03:58:15 user nova-compute[71428]: DEBUG nova.network.neutron [req-e4bca89c-723f-4953-94a2-74aa4aa3fe7b req-eeb16672-a5c2-44a8-9d00-61498b377236 service nova] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Updated VIF entry in instance network info cache for port 7f5011fe-2339-41ec-b6db-535375a33a9d. {{(pid=71428) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 23 03:58:15 user nova-compute[71428]: DEBUG nova.network.neutron [req-e4bca89c-723f-4953-94a2-74aa4aa3fe7b req-eeb16672-a5c2-44a8-9d00-61498b377236 service nova] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Updating instance_info_cache with network_info: [{"id": "7f5011fe-2339-41ec-b6db-535375a33a9d", "address": "fa:16:3e:8e:ca:f0", "network": {"id": "083ef998-5361-46b7-827f-d0f086e9e552", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1112168066-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "39d6cb4cfcb2413f8039ee9febf1c046", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f5011fe-23", "ovs_interfaceid": "7f5011fe-2339-41ec-b6db-535375a33a9d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:58:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-e4bca89c-723f-4953-94a2-74aa4aa3fe7b req-eeb16672-a5c2-44a8-9d00-61498b377236 service nova] Releasing lock "refresh_cache-994da665-4a5f-41ce-a104-febce8be2557" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:58:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:16 user nova-compute[71428]: DEBUG 
nova.compute.manager [req-36325053-1a26-476a-8a18-c45778c789d1 req-758e6ec2-32ca-4ed6-ad3f-997ef953d158 service nova] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Received event network-vif-plugged-7f5011fe-2339-41ec-b6db-535375a33a9d {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:58:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-36325053-1a26-476a-8a18-c45778c789d1 req-758e6ec2-32ca-4ed6-ad3f-997ef953d158 service nova] Acquiring lock "994da665-4a5f-41ce-a104-febce8be2557-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:58:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-36325053-1a26-476a-8a18-c45778c789d1 req-758e6ec2-32ca-4ed6-ad3f-997ef953d158 service nova] Lock "994da665-4a5f-41ce-a104-febce8be2557-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:58:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-36325053-1a26-476a-8a18-c45778c789d1 req-758e6ec2-32ca-4ed6-ad3f-997ef953d158 service nova] Lock "994da665-4a5f-41ce-a104-febce8be2557-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:58:16 user nova-compute[71428]: DEBUG nova.compute.manager [req-36325053-1a26-476a-8a18-c45778c789d1 req-758e6ec2-32ca-4ed6-ad3f-997ef953d158 service nova] [instance: 994da665-4a5f-41ce-a104-febce8be2557] No waiting events found dispatching network-vif-plugged-7f5011fe-2339-41ec-b6db-535375a33a9d {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:58:16 user nova-compute[71428]: WARNING nova.compute.manager [req-36325053-1a26-476a-8a18-c45778c789d1 req-758e6ec2-32ca-4ed6-ad3f-997ef953d158 service nova] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Received unexpected event network-vif-plugged-7f5011fe-2339-41ec-b6db-535375a33a9d for instance with vm_state building and task_state spawning. 
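[editor's note] The os-vif plug logged above is just an OVSDB transaction against the local ovsdb-server (AddBridgeCommand, then AddPortCommand plus a DbSetCommand on the Interface row), driven through the Python OVSDB IDL as the ovsdbapp.backend.ovs_idl lines show. As a rough manual equivalent only, using the values from the logged transaction, the same end state could be produced with:

    # illustrative sketch; os-vif does not shell out to ovs-vsctl here
    ovs-vsctl --may-exist add-br br-int -- set Bridge br-int datapath_type=system
    ovs-vsctl --may-exist add-port br-int tap7f5011fe-23 -- set Interface tap7f5011fe-23 \
        external_ids:iface-id=7f5011fe-2339-41ec-b6db-535375a33a9d \
        external_ids:iface-status=active \
        external_ids:attached-mac='"fa:16:3e:8e:ca:f0"' \
        external_ids:vm-uuid=994da665-4a5f-41ce-a104-febce8be2557
    # the MAC value needs OVSDB-level double quotes because it contains ':'

The "Transaction caused no change" line after AddBridgeCommand confirms br-int already existed; only the tap port and its external_ids are new.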
Apr 23 03:58:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:17 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:58:17 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:58:18 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Resumed> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:58:18 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 994da665-4a5f-41ce-a104-febce8be2557] VM Resumed (Lifecycle Event) Apr 23 03:58:18 user nova-compute[71428]: DEBUG nova.compute.manager [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Instance event wait completed in 0 seconds for {{(pid=71428) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 23 03:58:18 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Guest created on hypervisor {{(pid=71428) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 23 03:58:18 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Instance spawned successfully. 
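[editor's note] At this point the guest has been created and libvirt has reported it running (the "Resumed" lifecycle event and "Instance spawned successfully"). A quick hypervisor-side sanity check would look like the following; this is purely illustrative, using the libvirt domain name from the guest XML above, while nova-compute itself queries libvirt over its API rather than via virsh:

    virsh domstate instance-00000014        # expected: running
    virsh list --all | grep instance-00000014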
Apr 23 03:58:18 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 23 03:58:18 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:58:18 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 03:58:18 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Found default for hw_cdrom_bus of ide {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:58:18 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Found default for hw_disk_bus of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:58:18 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Found default for hw_input_bus of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:58:18 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Found default for hw_pointer_model of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:58:18 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Found default for hw_video_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:58:18 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: 
994da665-4a5f-41ce-a104-febce8be2557] Found default for hw_vif_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:58:18 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 994da665-4a5f-41ce-a104-febce8be2557] During sync_power_state the instance has a pending task (spawning). Skip. Apr 23 03:58:18 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Started> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:58:18 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 994da665-4a5f-41ce-a104-febce8be2557] VM Started (Lifecycle Event) Apr 23 03:58:18 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:58:18 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 03:58:18 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 994da665-4a5f-41ce-a104-febce8be2557] During sync_power_state the instance has a pending task (spawning). Skip. Apr 23 03:58:18 user nova-compute[71428]: INFO nova.compute.manager [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Took 5.34 seconds to spawn the instance on the hypervisor. Apr 23 03:58:18 user nova-compute[71428]: DEBUG nova.compute.manager [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:58:18 user nova-compute[71428]: INFO nova.compute.manager [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Took 6.00 seconds to build instance. 
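[editor's note] The sync_power_state entries above are the periodic reconciliation between the database and the hypervisor: the DB still records power_state 0 (NOSTATE in Nova's power_state enum) while libvirt reports 1 (RUNNING), and because the instance still has task_state spawning the sync is deliberately skipped. Once the build finishes, the same fields can be read back through the compute API, e.g. (illustrative; server name taken from the log, column names as exposed by the API's extended server attributes):

    openstack server show tempest-ServersNegativeTestJSON-server-1132169926 \
        -c status -c OS-EXT-STS:task_state -c OS-EXT-STS:power_state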
Apr 23 03:58:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-de3c3773-3eba-41cb-80b6-fb57a083dcfa tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Lock "994da665-4a5f-41ce-a104-febce8be2557" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.108s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:58:18 user nova-compute[71428]: DEBUG nova.compute.manager [req-1131464c-573c-4596-8396-371b9ffa03fb req-dec2ac02-566f-4170-b7cf-4eb19b715313 service nova] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Received event network-vif-plugged-7f5011fe-2339-41ec-b6db-535375a33a9d {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:58:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-1131464c-573c-4596-8396-371b9ffa03fb req-dec2ac02-566f-4170-b7cf-4eb19b715313 service nova] Acquiring lock "994da665-4a5f-41ce-a104-febce8be2557-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:58:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-1131464c-573c-4596-8396-371b9ffa03fb req-dec2ac02-566f-4170-b7cf-4eb19b715313 service nova] Lock "994da665-4a5f-41ce-a104-febce8be2557-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:58:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-1131464c-573c-4596-8396-371b9ffa03fb req-dec2ac02-566f-4170-b7cf-4eb19b715313 service nova] Lock "994da665-4a5f-41ce-a104-febce8be2557-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:58:18 user nova-compute[71428]: DEBUG nova.compute.manager [req-1131464c-573c-4596-8396-371b9ffa03fb req-dec2ac02-566f-4170-b7cf-4eb19b715313 service nova] [instance: 994da665-4a5f-41ce-a104-febce8be2557] No waiting events found dispatching network-vif-plugged-7f5011fe-2339-41ec-b6db-535375a33a9d {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:58:18 user nova-compute[71428]: WARNING nova.compute.manager [req-1131464c-573c-4596-8396-371b9ffa03fb req-dec2ac02-566f-4170-b7cf-4eb19b715313 service nova] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Received unexpected event network-vif-plugged-7f5011fe-2339-41ec-b6db-535375a33a9d for instance with vm_state active and task_state None. 
Apr 23 03:58:18 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:58:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:58:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:58:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:58:18 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Auditing locally available compute resources for user (node: user) {{(pid=71428) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 23 03:58:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:58:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:58:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:58:18 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:58:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/994da665-4a5f-41ce-a104-febce8be2557/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:58:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/994da665-4a5f-41ce-a104-febce8be2557/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:58:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/994da665-4a5f-41ce-a104-febce8be2557/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:58:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/994da665-4a5f-41ce-a104-febce8be2557/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:58:19 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:58:19 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
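[editor's note] The periodic resource audit probes each instance disk with qemu-img info, wrapped in oslo_concurrency.prlimit so a misbehaving qemu-img is capped at 1 GiB of address space (--as=1073741824) and 30 seconds of CPU time (--cpu=30); --force-share allows reading an image that the running QEMU still holds open. Reproduced from the log for the new instance, plus an lscpu check that explains the repeated "multiple sockets per NUMA node" warning:

    /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- \
        env LC_ALL=C LANG=C qemu-img info \
        /opt/stack/data/nova/instances/994da665-4a5f-41ce-a104-febce8be2557/disk \
        --force-share --output=json
    lscpu | grep -E 'Socket\(s\)|NUMA node\(s\)'   # more sockets than NUMA nodes disables `socket` PCI NUMA affinity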
Apr 23 03:58:19 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Hypervisor/Node resource view: name=user free_ram=8985MB free_disk=26.28176498413086GB free_vcpus=10 pci_devices=[{"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71428) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 23 03:58:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:58:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:58:19 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:19 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance ee89a9d5-0507-475c-bea2-e02e34d15710 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:58:19 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 994da665-4a5f-41ce-a104-febce8be2557 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:58:19 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Total usable vcpus: 12, total allocated vcpus: 2 {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 23 03:58:19 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Final resource view: name=user phys_ram=16023MB used_ram=768MB phys_disk=40GB used_disk=2GB total_vcpus=12 used_vcpus=2 pci_stats=[] {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 23 03:58:19 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:58:19 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:58:19 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Compute_service record updated for user:user {{(pid=71428) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 23 03:58:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.206s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:58:22 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:58:22 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:58:23 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:58:23 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Starting heal instance info cache {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 23 
03:58:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "refresh_cache-ee89a9d5-0507-475c-bea2-e02e34d15710" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:58:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquired lock "refresh_cache-ee89a9d5-0507-475c-bea2-e02e34d15710" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:58:23 user nova-compute[71428]: DEBUG nova.network.neutron [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Forcefully refreshing network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 23 03:58:24 user nova-compute[71428]: DEBUG nova.network.neutron [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Updating instance_info_cache with network_info: [{"id": "8948ed3a-de42-41ef-b7ca-c34a2d405b09", "address": "fa:16:3e:d2:42:a8", "network": {"id": "083ef998-5361-46b7-827f-d0f086e9e552", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1112168066-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "39d6cb4cfcb2413f8039ee9febf1c046", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8948ed3a-de", "ovs_interfaceid": "8948ed3a-de42-41ef-b7ca-c34a2d405b09", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:58:24 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Releasing lock "refresh_cache-ee89a9d5-0507-475c-bea2-e02e34d15710" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:58:24 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Updated the network info_cache for instance {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 23 03:58:24 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:58:24 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:58:24 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:58:24 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:24 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71428) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 23 03:58:24 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 03:58:24 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 03:58:24 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:29 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:58:31 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:33 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:34 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:36 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:38 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:38 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:39 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:40 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Acquiring lock "27abe5af-8845-40e6-a9c3-12399369b117" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:58:40 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lock "27abe5af-8845-40e6-a9c3-12399369b117" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 
03:58:40 user nova-compute[71428]: DEBUG nova.compute.manager [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Starting instance... {{(pid=71428) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 23 03:58:40 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:58:40 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:58:40 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71428) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 23 03:58:40 user nova-compute[71428]: INFO nova.compute.claims [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Claim successful on node user Apr 23 03:58:40 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:58:40 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:58:40 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.239s {{(pid=71428) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:58:40 user nova-compute[71428]: DEBUG nova.compute.manager [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Start building networks asynchronously for instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 23 03:58:40 user nova-compute[71428]: DEBUG nova.compute.manager [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Allocating IP information in the background. {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 23 03:58:40 user nova-compute[71428]: DEBUG nova.network.neutron [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] allocate_for_instance() {{(pid=71428) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 23 03:58:40 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 23 03:58:40 user nova-compute[71428]: DEBUG nova.compute.manager [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Start building block device mappings for instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 23 03:58:40 user nova-compute[71428]: DEBUG nova.policy [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '84b2f522c7164c5c934b8bbf113b542a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'db4b7bdc478946e2ad0a2060b433b42c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71428) authorize /opt/stack/nova/nova/policy.py:203}} Apr 23 03:58:40 user nova-compute[71428]: DEBUG nova.compute.manager [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Start spawning the instance on the hypervisor. 
{{(pid=71428) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 23 03:58:40 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Creating instance directory {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 23 03:58:40 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Creating image(s) Apr 23 03:58:40 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Acquiring lock "/opt/stack/data/nova/instances/27abe5af-8845-40e6-a9c3-12399369b117/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:58:40 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lock "/opt/stack/data/nova/instances/27abe5af-8845-40e6-a9c3-12399369b117/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:58:40 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lock "/opt/stack/data/nova/instances/27abe5af-8845-40e6-a9c3-12399369b117/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:58:40 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:58:40 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Acquiring lock "7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:58:40 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d 
tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lock "7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:58:40 user nova-compute[71428]: DEBUG nova.compute.manager [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Starting instance... {{(pid=71428) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 23 03:58:41 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.147s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:58:41 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Acquiring lock "935dc3474dddbde110531a31510941caddc0ae83" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:58:41 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:58:41 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:58:41 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:58:41 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:58:41 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71428) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 23 03:58:41 user nova-compute[71428]: INFO nova.compute.claims [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Claim successful on node user Apr 23 03:58:41 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.143s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:58:41 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/27abe5af-8845-40e6-a9c3-12399369b117/disk 1073741824 {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:58:41 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/27abe5af-8845-40e6-a9c3-12399369b117/disk 1073741824" returned: 0 in 0.051s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:58:41 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.200s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:58:41 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:58:41 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.139s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:58:41 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Checking if we can resize image /opt/stack/data/nova/instances/27abe5af-8845-40e6-a9c3-12399369b117/disk. size=1073741824 {{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 23 03:58:41 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/27abe5af-8845-40e6-a9c3-12399369b117/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:58:41 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:58:41 user nova-compute[71428]: DEBUG nova.network.neutron [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Successfully created port: 85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d {{(pid=71428) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 23 03:58:41 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:58:41 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 
tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.391s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:58:41 user nova-compute[71428]: DEBUG nova.compute.manager [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Start building networks asynchronously for instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 23 03:58:41 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/27abe5af-8845-40e6-a9c3-12399369b117/disk --force-share --output=json" returned: 0 in 0.147s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:58:41 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Cannot resize image /opt/stack/data/nova/instances/27abe5af-8845-40e6-a9c3-12399369b117/disk to a smaller size. {{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 23 03:58:41 user nova-compute[71428]: DEBUG nova.objects.instance [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lazy-loading 'migration_context' on Instance uuid 27abe5af-8845-40e6-a9c3-12399369b117 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:58:41 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Created local disks {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 23 03:58:41 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Ensure instance console log exists: /opt/stack/data/nova/instances/27abe5af-8845-40e6-a9c3-12399369b117/console.log {{(pid=71428) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 23 03:58:41 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:58:41 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 
tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:58:41 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:58:41 user nova-compute[71428]: DEBUG nova.compute.manager [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Allocating IP information in the background. {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 23 03:58:41 user nova-compute[71428]: DEBUG nova.network.neutron [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] allocate_for_instance() {{(pid=71428) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 23 03:58:41 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 23 03:58:41 user nova-compute[71428]: DEBUG nova.compute.manager [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Start building block device mappings for instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 23 03:58:41 user nova-compute[71428]: DEBUG nova.compute.manager [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Start spawning the instance on the hypervisor. 
{{(pid=71428) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 23 03:58:41 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Creating instance directory {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 23 03:58:41 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Creating image(s) Apr 23 03:58:41 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Acquiring lock "/opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:58:41 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lock "/opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:58:41 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lock "/opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:58:41 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:58:41 user nova-compute[71428]: DEBUG nova.policy [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '84b2f522c7164c5c934b8bbf113b542a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'db4b7bdc478946e2ad0a2060b433b42c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 
'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71428) authorize /opt/stack/nova/nova/policy.py:203}} Apr 23 03:58:41 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.138s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:58:41 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Acquiring lock "935dc3474dddbde110531a31510941caddc0ae83" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:58:41 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:58:41 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:58:42 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.155s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:58:42 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk 1073741824 {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:58:42 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d 
tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk 1073741824" returned: 0 in 0.048s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:58:42 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.210s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:58:42 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:58:42 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.141s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:58:42 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Checking if we can resize image /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk. 
size=1073741824 {{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 23 03:58:42 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:58:42 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:58:42 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Cannot resize image /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk to a smaller size. {{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 23 03:58:42 user nova-compute[71428]: DEBUG nova.objects.instance [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lazy-loading 'migration_context' on Instance uuid 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:58:42 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Created local disks {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 23 03:58:42 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Ensure instance console log exists: /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/console.log {{(pid=71428) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 23 03:58:42 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:58:42 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 
tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.002s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:58:42 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:58:42 user nova-compute[71428]: DEBUG nova.network.neutron [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Successfully updated port: 85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d {{(pid=71428) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 23 03:58:42 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Acquiring lock "refresh_cache-27abe5af-8845-40e6-a9c3-12399369b117" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:58:42 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Acquired lock "refresh_cache-27abe5af-8845-40e6-a9c3-12399369b117" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:58:42 user nova-compute[71428]: DEBUG nova.network.neutron [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Building network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 23 03:58:42 user nova-compute[71428]: DEBUG nova.compute.manager [req-6d097eb4-0646-491a-b596-fe8cd5b1d987 req-5723d892-e272-4888-a97d-8f023baf3682 service nova] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Received event network-changed-85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:58:42 user nova-compute[71428]: DEBUG nova.compute.manager [req-6d097eb4-0646-491a-b596-fe8cd5b1d987 req-5723d892-e272-4888-a97d-8f023baf3682 service nova] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Refreshing instance network info cache due to event network-changed-85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d. 
{{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 23 03:58:42 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-6d097eb4-0646-491a-b596-fe8cd5b1d987 req-5723d892-e272-4888-a97d-8f023baf3682 service nova] Acquiring lock "refresh_cache-27abe5af-8845-40e6-a9c3-12399369b117" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:58:42 user nova-compute[71428]: DEBUG nova.network.neutron [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Instance cache missing network info. {{(pid=71428) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 23 03:58:42 user nova-compute[71428]: DEBUG nova.network.neutron [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Successfully created port: c79b33fd-e966-4247-a5e7-3b6ee20d635c {{(pid=71428) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 23 03:58:43 user nova-compute[71428]: DEBUG nova.network.neutron [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Updating instance_info_cache with network_info: [{"id": "85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d", "address": "fa:16:3e:ae:5c:9d", "network": {"id": "5aff0e90-d0f3-4716-9f61-98dc8c049108", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-790345692-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "db4b7bdc478946e2ad0a2060b433b42c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap85c6dc8b-d8", "ovs_interfaceid": "85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:58:43 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Releasing lock "refresh_cache-27abe5af-8845-40e6-a9c3-12399369b117" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:58:43 user nova-compute[71428]: DEBUG nova.compute.manager [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Instance network_info: |[{"id": "85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d", "address": "fa:16:3e:ae:5c:9d", "network": {"id": "5aff0e90-d0f3-4716-9f61-98dc8c049108", "bridge": "br-int", "label": 
"tempest-ServerRescueNegativeTestJSON-790345692-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "db4b7bdc478946e2ad0a2060b433b42c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap85c6dc8b-d8", "ovs_interfaceid": "85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 23 03:58:43 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-6d097eb4-0646-491a-b596-fe8cd5b1d987 req-5723d892-e272-4888-a97d-8f023baf3682 service nova] Acquired lock "refresh_cache-27abe5af-8845-40e6-a9c3-12399369b117" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:58:43 user nova-compute[71428]: DEBUG nova.network.neutron [req-6d097eb4-0646-491a-b596-fe8cd5b1d987 req-5723d892-e272-4888-a97d-8f023baf3682 service nova] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Refreshing network info cache for port 85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 23 03:58:43 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Start _get_guest_xml network_info=[{"id": "85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d", "address": "fa:16:3e:ae:5c:9d", "network": {"id": "5aff0e90-d0f3-4716-9f61-98dc8c049108", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-790345692-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "db4b7bdc478946e2ad0a2060b433b42c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap85c6dc8b-d8", "ovs_interfaceid": "85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'image_id': 'e6127373-9931-4277-9458-eceef653ea1e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 23 03:58:43 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:58:43 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:58:43 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71428) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 23 03:58:43 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-23T03:43:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=), allow threads: True {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 23 03:58:43 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Flavor limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 23 03:58:43 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-9e940828-61eb-4d16-9eeb-de6764890000 
tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Image limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 23 03:58:43 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Flavor pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 23 03:58:43 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Image pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 23 03:58:43 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 23 03:58:43 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 23 03:58:43 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 23 03:58:43 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Got 1 possible topologies {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 23 03:58:43 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 23 03:58:43 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 23 03:58:43 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:58:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1213396490',display_name='tempest-ServerRescueNegativeTestJSON-server-1213396490',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-1213396490',id=21,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='db4b7bdc478946e2ad0a2060b433b42c',ramdisk_id='',reservation_id='r-u91j4iwy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-509507797',owner_user_name='tempest-ServerRescueNegativeTestJSON-509507797-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:58:41Z,user_data=None,user_id='84b2f522c7164c5c934b8bbf113b542a',uuid=27abe5af-8845-40e6-a9c3-12399369b117,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d", "address": "fa:16:3e:ae:5c:9d", "network": {"id": "5aff0e90-d0f3-4716-9f61-98dc8c049108", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-790345692-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "db4b7bdc478946e2ad0a2060b433b42c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap85c6dc8b-d8", "ovs_interfaceid": "85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71428) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 23 03:58:43 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Converting VIF {"id": "85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d", "address": 
"fa:16:3e:ae:5c:9d", "network": {"id": "5aff0e90-d0f3-4716-9f61-98dc8c049108", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-790345692-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "db4b7bdc478946e2ad0a2060b433b42c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap85c6dc8b-d8", "ovs_interfaceid": "85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:58:43 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ae:5c:9d,bridge_name='br-int',has_traffic_filtering=True,id=85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d,network=Network(5aff0e90-d0f3-4716-9f61-98dc8c049108),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap85c6dc8b-d8') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:58:43 user nova-compute[71428]: DEBUG nova.objects.instance [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lazy-loading 'pci_devices' on Instance uuid 27abe5af-8845-40e6-a9c3-12399369b117 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:58:43 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] End _get_guest_xml xml= Apr 23 03:58:43 user nova-compute[71428]: 27abe5af-8845-40e6-a9c3-12399369b117 Apr 23 03:58:43 user nova-compute[71428]: instance-00000015 Apr 23 03:58:43 user nova-compute[71428]: 131072 Apr 23 03:58:43 user nova-compute[71428]: 1 Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: tempest-ServerRescueNegativeTestJSON-server-1213396490 Apr 23 03:58:43 user nova-compute[71428]: 2023-04-23 03:58:43 Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: 128 Apr 23 03:58:43 user nova-compute[71428]: 1 Apr 23 03:58:43 user nova-compute[71428]: 0 Apr 23 03:58:43 user nova-compute[71428]: 0 Apr 23 03:58:43 user nova-compute[71428]: 1 Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: tempest-ServerRescueNegativeTestJSON-509507797-project-member Apr 23 03:58:43 user nova-compute[71428]: tempest-ServerRescueNegativeTestJSON-509507797 Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user 
nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: OpenStack Foundation Apr 23 03:58:43 user nova-compute[71428]: OpenStack Nova Apr 23 03:58:43 user nova-compute[71428]: 0.0.0 Apr 23 03:58:43 user nova-compute[71428]: 27abe5af-8845-40e6-a9c3-12399369b117 Apr 23 03:58:43 user nova-compute[71428]: 27abe5af-8845-40e6-a9c3-12399369b117 Apr 23 03:58:43 user nova-compute[71428]: Virtual Machine Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: hvm Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Nehalem Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: /dev/urandom Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: Apr 23 03:58:43 user nova-compute[71428]: {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 23 03:58:43 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:58:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1213396490',display_name='tempest-ServerRescueNegativeTestJSON-server-1213396490',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-1213396490',id=21,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='db4b7bdc478946e2ad0a2060b433b42c',ramdisk_id='',reservation_id='r-u91j4iwy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-509507797',owner_user_name='tempest-ServerRescueNegativeTestJSON-509507797-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:58:41Z,user_data=None,user_id='84b2f522c7164c5c934b8bbf113b542a',uuid=27abe5af-8845-40e6-a9c3-12399369b117,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d", "address": "fa:16:3e:ae:5c:9d", "network": {"id": "5aff0e90-d0f3-4716-9f61-98dc8c049108", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-790345692-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "db4b7bdc478946e2ad0a2060b433b42c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap85c6dc8b-d8", "ovs_interfaceid": "85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 23 03:58:43 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Converting VIF {"id": "85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d", "address": 
"fa:16:3e:ae:5c:9d", "network": {"id": "5aff0e90-d0f3-4716-9f61-98dc8c049108", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-790345692-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "db4b7bdc478946e2ad0a2060b433b42c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap85c6dc8b-d8", "ovs_interfaceid": "85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:58:43 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ae:5c:9d,bridge_name='br-int',has_traffic_filtering=True,id=85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d,network=Network(5aff0e90-d0f3-4716-9f61-98dc8c049108),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap85c6dc8b-d8') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:58:43 user nova-compute[71428]: DEBUG os_vif [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ae:5c:9d,bridge_name='br-int',has_traffic_filtering=True,id=85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d,network=Network(5aff0e90-d0f3-4716-9f61-98dc8c049108),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap85c6dc8b-d8') {{(pid=71428) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 23 03:58:43 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:43 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:58:43 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 23 03:58:43 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:43 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap85c6dc8b-d8, may_exist=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:58:43 user nova-compute[71428]: DEBUG 
ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap85c6dc8b-d8, col_values=(('external_ids', {'iface-id': '85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ae:5c:9d', 'vm-uuid': '27abe5af-8845-40e6-a9c3-12399369b117'}),)) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:58:43 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:43 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:58:43 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:43 user nova-compute[71428]: INFO os_vif [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ae:5c:9d,bridge_name='br-int',has_traffic_filtering=True,id=85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d,network=Network(5aff0e90-d0f3-4716-9f61-98dc8c049108),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap85c6dc8b-d8') Apr 23 03:58:43 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] No BDM found with device name vda, not building metadata. {{(pid=71428) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 23 03:58:43 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] No VIF found with MAC fa:16:3e:ae:5c:9d, not building metadata {{(pid=71428) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 23 03:58:43 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG nova.network.neutron [req-6d097eb4-0646-491a-b596-fe8cd5b1d987 req-5723d892-e272-4888-a97d-8f023baf3682 service nova] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Updated VIF entry in instance network info cache for port 85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d. 
{{(pid=71428) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG nova.network.neutron [req-6d097eb4-0646-491a-b596-fe8cd5b1d987 req-5723d892-e272-4888-a97d-8f023baf3682 service nova] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Updating instance_info_cache with network_info: [{"id": "85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d", "address": "fa:16:3e:ae:5c:9d", "network": {"id": "5aff0e90-d0f3-4716-9f61-98dc8c049108", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-790345692-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "db4b7bdc478946e2ad0a2060b433b42c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap85c6dc8b-d8", "ovs_interfaceid": "85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-6d097eb4-0646-491a-b596-fe8cd5b1d987 req-5723d892-e272-4888-a97d-8f023baf3682 service nova] Releasing lock "refresh_cache-27abe5af-8845-40e6-a9c3-12399369b117" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG nova.network.neutron [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Successfully updated port: c79b33fd-e966-4247-a5e7-3b6ee20d635c {{(pid=71428) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Acquiring lock "refresh_cache-7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Acquired lock "refresh_cache-7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG nova.network.neutron [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Building network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG nova.network.neutron [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 
tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Instance cache missing network info. {{(pid=71428) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG nova.network.neutron [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Updating instance_info_cache with network_info: [{"id": "c79b33fd-e966-4247-a5e7-3b6ee20d635c", "address": "fa:16:3e:04:78:26", "network": {"id": "5aff0e90-d0f3-4716-9f61-98dc8c049108", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-790345692-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "db4b7bdc478946e2ad0a2060b433b42c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc79b33fd-e9", "ovs_interfaceid": "c79b33fd-e966-4247-a5e7-3b6ee20d635c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Releasing lock "refresh_cache-7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG nova.compute.manager [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Instance network_info: |[{"id": "c79b33fd-e966-4247-a5e7-3b6ee20d635c", "address": "fa:16:3e:04:78:26", "network": {"id": "5aff0e90-d0f3-4716-9f61-98dc8c049108", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-790345692-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "db4b7bdc478946e2ad0a2060b433b42c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc79b33fd-e9", "ovs_interfaceid": "c79b33fd-e966-4247-a5e7-3b6ee20d635c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| 
{{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Start _get_guest_xml network_info=[{"id": "c79b33fd-e966-4247-a5e7-3b6ee20d635c", "address": "fa:16:3e:04:78:26", "network": {"id": "5aff0e90-d0f3-4716-9f61-98dc8c049108", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-790345692-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "db4b7bdc478946e2ad0a2060b433b42c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc79b33fd-e9", "ovs_interfaceid": "c79b33fd-e966-4247-a5e7-3b6ee20d635c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'image_id': 'e6127373-9931-4277-9458-eceef653ea1e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:44 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:58:44 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 23 03:58:44 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71428) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-23T03:43:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=), allow threads: True {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Flavor limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Image limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Flavor pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Image pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), 
maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Got 1 possible topologies {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:58:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1609850900',display_name='tempest-ServerRescueNegativeTestJSON-server-1609850900',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-1609850900',id=22,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='db4b7bdc478946e2ad0a2060b433b42c',ramdisk_id='',reservation_id='r-zpb2ujai',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-509507797',owner_user_name='tempest-ServerRescueNegativeTestJSON-509507797-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:58:42Z,user_data=None,user_id='84b2f522c7164c5c934b8bbf113b542a',uuid=7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c79b33fd-e966-4247-a5e7-3b6ee20d635c", "address": "fa:16:3e:04:78:26", "network": {"id": "5aff0e90-d0f3-4716-9f61-98dc8c049108", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-790345692-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "db4b7bdc478946e2ad0a2060b433b42c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc79b33fd-e9", "ovs_interfaceid": "c79b33fd-e966-4247-a5e7-3b6ee20d635c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71428) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Converting VIF {"id": "c79b33fd-e966-4247-a5e7-3b6ee20d635c", "address": 
"fa:16:3e:04:78:26", "network": {"id": "5aff0e90-d0f3-4716-9f61-98dc8c049108", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-790345692-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "db4b7bdc478946e2ad0a2060b433b42c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc79b33fd-e9", "ovs_interfaceid": "c79b33fd-e966-4247-a5e7-3b6ee20d635c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:78:26,bridge_name='br-int',has_traffic_filtering=True,id=c79b33fd-e966-4247-a5e7-3b6ee20d635c,network=Network(5aff0e90-d0f3-4716-9f61-98dc8c049108),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc79b33fd-e9') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG nova.objects.instance [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lazy-loading 'pci_devices' on Instance uuid 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] End _get_guest_xml xml= Apr 23 03:58:44 user nova-compute[71428]: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23 Apr 23 03:58:44 user nova-compute[71428]: instance-00000016 Apr 23 03:58:44 user nova-compute[71428]: 131072 Apr 23 03:58:44 user nova-compute[71428]: 1 Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: tempest-ServerRescueNegativeTestJSON-server-1609850900 Apr 23 03:58:44 user nova-compute[71428]: 2023-04-23 03:58:44 Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: 128 Apr 23 03:58:44 user nova-compute[71428]: 1 Apr 23 03:58:44 user nova-compute[71428]: 0 Apr 23 03:58:44 user nova-compute[71428]: 0 Apr 23 03:58:44 user nova-compute[71428]: 1 Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: tempest-ServerRescueNegativeTestJSON-509507797-project-member Apr 23 03:58:44 user nova-compute[71428]: tempest-ServerRescueNegativeTestJSON-509507797 Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user 
nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: OpenStack Foundation Apr 23 03:58:44 user nova-compute[71428]: OpenStack Nova Apr 23 03:58:44 user nova-compute[71428]: 0.0.0 Apr 23 03:58:44 user nova-compute[71428]: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23 Apr 23 03:58:44 user nova-compute[71428]: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23 Apr 23 03:58:44 user nova-compute[71428]: Virtual Machine Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: hvm Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Nehalem Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: /dev/urandom Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: Apr 23 03:58:44 user nova-compute[71428]: {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:58:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1609850900',display_name='tempest-ServerRescueNegativeTestJSON-server-1609850900',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-1609850900',id=22,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='db4b7bdc478946e2ad0a2060b433b42c',ramdisk_id='',reservation_id='r-zpb2ujai',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-509507797',owner_user_name='tempest-ServerRescueNegativeTestJSON-509507797-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:58:42Z,user_data=None,user_id='84b2f522c7164c5c934b8bbf113b542a',uuid=7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c79b33fd-e966-4247-a5e7-3b6ee20d635c", "address": "fa:16:3e:04:78:26", "network": {"id": "5aff0e90-d0f3-4716-9f61-98dc8c049108", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-790345692-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "db4b7bdc478946e2ad0a2060b433b42c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc79b33fd-e9", "ovs_interfaceid": "c79b33fd-e966-4247-a5e7-3b6ee20d635c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Converting VIF {"id": "c79b33fd-e966-4247-a5e7-3b6ee20d635c", "address": 
"fa:16:3e:04:78:26", "network": {"id": "5aff0e90-d0f3-4716-9f61-98dc8c049108", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-790345692-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "db4b7bdc478946e2ad0a2060b433b42c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc79b33fd-e9", "ovs_interfaceid": "c79b33fd-e966-4247-a5e7-3b6ee20d635c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:78:26,bridge_name='br-int',has_traffic_filtering=True,id=c79b33fd-e966-4247-a5e7-3b6ee20d635c,network=Network(5aff0e90-d0f3-4716-9f61-98dc8c049108),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc79b33fd-e9') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG os_vif [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:78:26,bridge_name='br-int',has_traffic_filtering=True,id=c79b33fd-e966-4247-a5e7-3b6ee20d635c,network=Network(5aff0e90-d0f3-4716-9f61-98dc8c049108),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc79b33fd-e9') {{(pid=71428) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG nova.compute.manager [req-d5a7af4d-37c4-47e5-b432-5cb553ff94fc req-873b43b5-5761-4048-9709-558aaf0c729d service nova] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Received event network-changed-c79b33fd-e966-4247-a5e7-3b6ee20d635c {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG nova.compute.manager [req-d5a7af4d-37c4-47e5-b432-5cb553ff94fc req-873b43b5-5761-4048-9709-558aaf0c729d service nova] [instance: 
7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Refreshing instance network info cache due to event network-changed-c79b33fd-e966-4247-a5e7-3b6ee20d635c. {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-d5a7af4d-37c4-47e5-b432-5cb553ff94fc req-873b43b5-5761-4048-9709-558aaf0c729d service nova] Acquiring lock "refresh_cache-7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-d5a7af4d-37c4-47e5-b432-5cb553ff94fc req-873b43b5-5761-4048-9709-558aaf0c729d service nova] Acquired lock "refresh_cache-7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG nova.network.neutron [req-d5a7af4d-37c4-47e5-b432-5cb553ff94fc req-873b43b5-5761-4048-9709-558aaf0c729d service nova] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Refreshing network info cache for port c79b33fd-e966-4247-a5e7-3b6ee20d635c {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc79b33fd-e9, may_exist=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc79b33fd-e9, col_values=(('external_ids', {'iface-id': 'c79b33fd-e966-4247-a5e7-3b6ee20d635c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:04:78:26', 'vm-uuid': '7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23'}),)) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:44 user nova-compute[71428]: INFO os_vif [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:78:26,bridge_name='br-int',has_traffic_filtering=True,id=c79b33fd-e966-4247-a5e7-3b6ee20d635c,network=Network(5aff0e90-d0f3-4716-9f61-98dc8c049108),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc79b33fd-e9') Apr 23 03:58:44 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 
tempest-ServerRescueNegativeTestJSON-509507797-project-member] No BDM found with device name vda, not building metadata. {{(pid=71428) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 23 03:58:44 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] No VIF found with MAC fa:16:3e:04:78:26, not building metadata {{(pid=71428) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 23 03:58:45 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:45 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:45 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:45 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:45 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:45 user nova-compute[71428]: DEBUG nova.network.neutron [req-d5a7af4d-37c4-47e5-b432-5cb553ff94fc req-873b43b5-5761-4048-9709-558aaf0c729d service nova] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Updated VIF entry in instance network info cache for port c79b33fd-e966-4247-a5e7-3b6ee20d635c. 
{{(pid=71428) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 23 03:58:45 user nova-compute[71428]: DEBUG nova.network.neutron [req-d5a7af4d-37c4-47e5-b432-5cb553ff94fc req-873b43b5-5761-4048-9709-558aaf0c729d service nova] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Updating instance_info_cache with network_info: [{"id": "c79b33fd-e966-4247-a5e7-3b6ee20d635c", "address": "fa:16:3e:04:78:26", "network": {"id": "5aff0e90-d0f3-4716-9f61-98dc8c049108", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-790345692-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "db4b7bdc478946e2ad0a2060b433b42c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc79b33fd-e9", "ovs_interfaceid": "c79b33fd-e966-4247-a5e7-3b6ee20d635c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:58:45 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-d5a7af4d-37c4-47e5-b432-5cb553ff94fc req-873b43b5-5761-4048-9709-558aaf0c729d service nova] Releasing lock "refresh_cache-7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:58:46 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:46 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:46 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:46 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:46 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:46 user nova-compute[71428]: DEBUG nova.compute.manager [req-feafc0ca-b684-4e9f-8918-eb00570589f1 req-fa23f4c8-1b0b-4008-9c37-9c15435e1459 service nova] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Received event network-vif-plugged-85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:58:46 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-feafc0ca-b684-4e9f-8918-eb00570589f1 req-fa23f4c8-1b0b-4008-9c37-9c15435e1459 service nova] Acquiring lock "27abe5af-8845-40e6-a9c3-12399369b117-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:58:46 user nova-compute[71428]: DEBUG 
oslo_concurrency.lockutils [req-feafc0ca-b684-4e9f-8918-eb00570589f1 req-fa23f4c8-1b0b-4008-9c37-9c15435e1459 service nova] Lock "27abe5af-8845-40e6-a9c3-12399369b117-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:58:46 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-feafc0ca-b684-4e9f-8918-eb00570589f1 req-fa23f4c8-1b0b-4008-9c37-9c15435e1459 service nova] Lock "27abe5af-8845-40e6-a9c3-12399369b117-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:58:46 user nova-compute[71428]: DEBUG nova.compute.manager [req-feafc0ca-b684-4e9f-8918-eb00570589f1 req-fa23f4c8-1b0b-4008-9c37-9c15435e1459 service nova] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] No waiting events found dispatching network-vif-plugged-85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:58:46 user nova-compute[71428]: WARNING nova.compute.manager [req-feafc0ca-b684-4e9f-8918-eb00570589f1 req-fa23f4c8-1b0b-4008-9c37-9c15435e1459 service nova] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Received unexpected event network-vif-plugged-85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d for instance with vm_state building and task_state spawning. Apr 23 03:58:46 user nova-compute[71428]: DEBUG nova.compute.manager [req-feafc0ca-b684-4e9f-8918-eb00570589f1 req-fa23f4c8-1b0b-4008-9c37-9c15435e1459 service nova] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Received event network-vif-plugged-85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:58:46 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-feafc0ca-b684-4e9f-8918-eb00570589f1 req-fa23f4c8-1b0b-4008-9c37-9c15435e1459 service nova] Acquiring lock "27abe5af-8845-40e6-a9c3-12399369b117-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:58:46 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-feafc0ca-b684-4e9f-8918-eb00570589f1 req-fa23f4c8-1b0b-4008-9c37-9c15435e1459 service nova] Lock "27abe5af-8845-40e6-a9c3-12399369b117-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:58:46 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-feafc0ca-b684-4e9f-8918-eb00570589f1 req-fa23f4c8-1b0b-4008-9c37-9c15435e1459 service nova] Lock "27abe5af-8845-40e6-a9c3-12399369b117-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:58:46 user nova-compute[71428]: DEBUG nova.compute.manager [req-feafc0ca-b684-4e9f-8918-eb00570589f1 req-fa23f4c8-1b0b-4008-9c37-9c15435e1459 service nova] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] No waiting events found dispatching network-vif-plugged-85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:58:46 user 
nova-compute[71428]: WARNING nova.compute.manager [req-feafc0ca-b684-4e9f-8918-eb00570589f1 req-fa23f4c8-1b0b-4008-9c37-9c15435e1459 service nova] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Received unexpected event network-vif-plugged-85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d for instance with vm_state building and task_state spawning. Apr 23 03:58:46 user nova-compute[71428]: DEBUG nova.compute.manager [req-feafc0ca-b684-4e9f-8918-eb00570589f1 req-fa23f4c8-1b0b-4008-9c37-9c15435e1459 service nova] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Received event network-vif-plugged-c79b33fd-e966-4247-a5e7-3b6ee20d635c {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:58:46 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-feafc0ca-b684-4e9f-8918-eb00570589f1 req-fa23f4c8-1b0b-4008-9c37-9c15435e1459 service nova] Acquiring lock "7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:58:46 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-feafc0ca-b684-4e9f-8918-eb00570589f1 req-fa23f4c8-1b0b-4008-9c37-9c15435e1459 service nova] Lock "7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:58:46 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-feafc0ca-b684-4e9f-8918-eb00570589f1 req-fa23f4c8-1b0b-4008-9c37-9c15435e1459 service nova] Lock "7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:58:46 user nova-compute[71428]: DEBUG nova.compute.manager [req-feafc0ca-b684-4e9f-8918-eb00570589f1 req-fa23f4c8-1b0b-4008-9c37-9c15435e1459 service nova] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] No waiting events found dispatching network-vif-plugged-c79b33fd-e966-4247-a5e7-3b6ee20d635c {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:58:46 user nova-compute[71428]: WARNING nova.compute.manager [req-feafc0ca-b684-4e9f-8918-eb00570589f1 req-fa23f4c8-1b0b-4008-9c37-9c15435e1459 service nova] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Received unexpected event network-vif-plugged-c79b33fd-e966-4247-a5e7-3b6ee20d635c for instance with vm_state building and task_state spawning. 
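[editor's note] These network-vif-plugged notifications are the tail end of the OVS plumbing recorded earlier in this extract: an AddBridgeCommand and AddPortCommand followed by a DbSetCommand that stamps the Interface row's external_ids with the Neutron port id (iface-id), MAC and instance UUID, which is what OVN matches on to bind the port. A rough sketch of that same transaction written against ovsdbapp's Open_vSwitch schema bindings follows; the OVSDB endpoint and timeout are placeholders, not values from this host, and this is an illustration rather than the os-vif code path itself.

    # Sketch of the bridge/port/external_ids transaction shown in the log,
    # using ovsdbapp's Open_vSwitch bindings. Endpoint/timeout are assumed.
    from ovsdbapp.backend.ovs_idl import connection
    from ovsdbapp.schema.open_vswitch import impl_idl

    idl = connection.OvsdbIdl.from_server('tcp:127.0.0.1:6640', 'Open_vSwitch')
    api = impl_idl.OvsdbIdl(connection.Connection(idl, timeout=10))

    port = 'tap85c6dc8b-d8'
    external_ids = {
        'iface-id': '85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d',   # Neutron port UUID
        'iface-status': 'active',
        'attached-mac': 'fa:16:3e:ae:5c:9d',
        'vm-uuid': '27abe5af-8845-40e6-a9c3-12399369b117',
    }

    # One transaction, three commands -- mirroring the "txn n=1" entries above.
    with api.transaction(check_error=True) as txn:
        txn.add(api.add_br('br-int', may_exist=True, datapath_type='system'))
        txn.add(api.add_port('br-int', port, may_exist=True))
        txn.add(api.db_set('Interface', port, ('external_ids', external_ids)))

The "Transaction caused no change" line for AddBridgeCommand is expected: br-int already exists, and may_exist=True makes the command a no-op rather than an error.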
Apr 23 03:58:46 user nova-compute[71428]: DEBUG nova.compute.manager [req-feafc0ca-b684-4e9f-8918-eb00570589f1 req-fa23f4c8-1b0b-4008-9c37-9c15435e1459 service nova] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Received event network-vif-plugged-c79b33fd-e966-4247-a5e7-3b6ee20d635c {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 03:58:46 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-feafc0ca-b684-4e9f-8918-eb00570589f1 req-fa23f4c8-1b0b-4008-9c37-9c15435e1459 service nova] Acquiring lock "7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:58:46 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-feafc0ca-b684-4e9f-8918-eb00570589f1 req-fa23f4c8-1b0b-4008-9c37-9c15435e1459 service nova] Lock "7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:58:46 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-feafc0ca-b684-4e9f-8918-eb00570589f1 req-fa23f4c8-1b0b-4008-9c37-9c15435e1459 service nova] Lock "7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:58:46 user nova-compute[71428]: DEBUG nova.compute.manager [req-feafc0ca-b684-4e9f-8918-eb00570589f1 req-fa23f4c8-1b0b-4008-9c37-9c15435e1459 service nova] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] No waiting events found dispatching network-vif-plugged-c79b33fd-e966-4247-a5e7-3b6ee20d635c {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 03:58:46 user nova-compute[71428]: WARNING nova.compute.manager [req-feafc0ca-b684-4e9f-8918-eb00570589f1 req-fa23f4c8-1b0b-4008-9c37-9c15435e1459 service nova] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Received unexpected event network-vif-plugged-c79b33fd-e966-4247-a5e7-3b6ee20d635c for instance with vm_state building and task_state spawning. 
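[editor's note] The "No waiting events found" / "Received unexpected event" pairs above are Nova's external-event handshake: the spawn path registers a waiter for network-vif-plugged before plugging the port, and the Neutron notification pops that waiter; a notification that arrives when no waiter is registered (here, duplicate reports while both instances are still spawning) is logged as unexpected and dropped, which is harmless. A minimal, self-contained sketch of that pattern is below; the class and function names are invented for illustration and are not Nova's actual code.

    # Simplified waiter/pop pattern behind the "unexpected event" warnings.
    import threading

    class InstanceEvents:
        def __init__(self):
            self._lock = threading.Lock()
            self._events = {}  # (instance_uuid, event_name) -> threading.Event

        def prepare(self, instance_uuid, event_name):
            # Register interest before starting the work that triggers the event.
            with self._lock:
                ev = threading.Event()
                self._events[(instance_uuid, event_name)] = ev
                return ev

        def pop(self, instance_uuid, event_name):
            # Called by the external-event handler; None means nobody is waiting.
            with self._lock:
                return self._events.pop((instance_uuid, event_name), None)

    def external_instance_event(events, instance_uuid, event_name):
        waiter = events.pop(instance_uuid, event_name)
        if waiter is None:
            # Same situation as the WARNING lines above: no registered waiter.
            print("unexpected event %s for %s" % (event_name, instance_uuid))
        else:
            waiter.set()

    if __name__ == "__main__":
        events = InstanceEvents()
        uuid = "27abe5af-8845-40e6-a9c3-12399369b117"
        name = "network-vif-plugged-85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d"
        w = events.prepare(uuid, name)
        external_instance_event(events, uuid, name)   # satisfies the waiter
        print("waiter satisfied:", w.wait(timeout=1))
        external_instance_event(events, uuid, name)   # duplicate -> "unexpected"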
Apr 23 03:58:47 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Resumed> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:58:47 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] VM Resumed (Lifecycle Event) Apr 23 03:58:47 user nova-compute[71428]: DEBUG nova.compute.manager [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Instance event wait completed in 0 seconds for {{(pid=71428) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 23 03:58:47 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Guest created on hypervisor {{(pid=71428) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 23 03:58:47 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Instance spawned successfully. Apr 23 03:58:47 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 23 03:58:47 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:58:47 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 03:58:47 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Found default for hw_cdrom_bus of ide {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:58:47 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Found default for hw_disk_bus of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:58:47 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None 
req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Found default for hw_input_bus of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:58:47 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Found default for hw_pointer_model of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:58:47 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Found default for hw_video_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:58:47 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Found default for hw_vif_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:58:47 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] During sync_power_state the instance has a pending task (spawning). Skip. Apr 23 03:58:47 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Started> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:58:47 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] VM Started (Lifecycle Event) Apr 23 03:58:47 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:58:47 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 03:58:47 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 23 03:58:47 user nova-compute[71428]: INFO nova.compute.manager [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Took 7.03 seconds to spawn the instance on the hypervisor. Apr 23 03:58:47 user nova-compute[71428]: DEBUG nova.compute.manager [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:58:47 user nova-compute[71428]: INFO nova.compute.manager [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Took 7.60 seconds to build instance. Apr 23 03:58:47 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-9e940828-61eb-4d16-9eeb-de6764890000 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lock "27abe5af-8845-40e6-a9c3-12399369b117" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.693s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:58:48 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Resumed> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:58:48 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] VM Resumed (Lifecycle Event) Apr 23 03:58:48 user nova-compute[71428]: DEBUG nova.compute.manager [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Instance event wait completed in 0 seconds for {{(pid=71428) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 23 03:58:48 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Guest created on hypervisor {{(pid=71428) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 23 03:58:48 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Instance spawned successfully. 
Apr 23 03:58:48 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 23 03:58:48 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:58:48 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 03:58:48 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Found default for hw_cdrom_bus of ide {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:58:48 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Found default for hw_disk_bus of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:58:48 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Found default for hw_input_bus of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:58:48 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Found default for hw_pointer_model of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:58:48 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Found default for hw_video_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:58:48 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 
tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Found default for hw_vif_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 03:58:48 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] During sync_power_state the instance has a pending task (spawning). Skip. Apr 23 03:58:48 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Started> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 03:58:48 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] VM Started (Lifecycle Event) Apr 23 03:58:48 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:58:48 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 03:58:48 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] During sync_power_state the instance has a pending task (spawning). Skip. Apr 23 03:58:48 user nova-compute[71428]: INFO nova.compute.manager [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Took 6.66 seconds to spawn the instance on the hypervisor. Apr 23 03:58:48 user nova-compute[71428]: DEBUG nova.compute.manager [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 03:58:48 user nova-compute[71428]: INFO nova.compute.manager [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Took 7.36 seconds to build instance. 
Apr 23 03:58:48 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-86b9a6ad-34e9-4be8-a8d0-cd291abad83d tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lock "7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.466s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:58:48 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:49 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:53 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:54 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:58:59 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:59:03 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:59:04 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:59:08 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:59:09 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:59:12 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:59:13 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:59:14 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:59:14 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:59:14 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71428) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 23 03:59:17 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:59:18 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:59:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:59:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:59:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:59:18 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Auditing locally available compute resources for user (node: user) {{(pid=71428) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 23 03:59:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:59:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:59:18 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:59:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None 
req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk --force-share --output=json" returned: 0 in 0.150s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:59:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/27abe5af-8845-40e6-a9c3-12399369b117/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:59:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/27abe5af-8845-40e6-a9c3-12399369b117/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:59:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/27abe5af-8845-40e6-a9c3-12399369b117/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:59:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/27abe5af-8845-40e6-a9c3-12399369b117/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:59:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:59:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:59:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:59:19 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:59:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk --force-share --output=json" returned: 0 in 0.151s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:59:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/994da665-4a5f-41ce-a104-febce8be2557/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:59:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/994da665-4a5f-41ce-a104-febce8be2557/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:59:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/994da665-4a5f-41ce-a104-febce8be2557/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 03:59:20 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/994da665-4a5f-41ce-a104-febce8be2557/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 03:59:20 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 03:59:20 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 23 03:59:20 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Hypervisor/Node resource view: name=user free_ram=8736MB free_disk=26.21178436279297GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71428) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 23 03:59:20 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 03:59:20 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 03:59:20 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance ee89a9d5-0507-475c-bea2-e02e34d15710 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:59:20 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 994da665-4a5f-41ce-a104-febce8be2557 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:59:20 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 27abe5af-8845-40e6-a9c3-12399369b117 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:59:20 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 03:59:20 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Total usable vcpus: 12, total allocated vcpus: 4 {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 23 03:59:20 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Final resource view: name=user phys_ram=16023MB used_ram=1024MB phys_disk=40GB used_disk=4GB total_vcpus=12 used_vcpus=4 pci_stats=[] {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 23 03:59:20 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 03:59:20 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 03:59:20 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Compute_service record updated for user:user {{(pid=71428) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 23 03:59:20 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.293s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 03:59:21 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:59:23 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:59:23 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Starting heal instance info cache {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 23 03:59:23 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Rebuilding the list of instances to heal {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 23 03:59:23 user nova-compute[71428]: DEBUG 
oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "refresh_cache-ee89a9d5-0507-475c-bea2-e02e34d15710" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 03:59:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquired lock "refresh_cache-ee89a9d5-0507-475c-bea2-e02e34d15710" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 03:59:23 user nova-compute[71428]: DEBUG nova.network.neutron [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Forcefully refreshing network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 23 03:59:23 user nova-compute[71428]: DEBUG nova.objects.instance [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lazy-loading 'info_cache' on Instance uuid ee89a9d5-0507-475c-bea2-e02e34d15710 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 03:59:24 user nova-compute[71428]: DEBUG nova.network.neutron [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Updating instance_info_cache with network_info: [{"id": "8948ed3a-de42-41ef-b7ca-c34a2d405b09", "address": "fa:16:3e:d2:42:a8", "network": {"id": "083ef998-5361-46b7-827f-d0f086e9e552", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1112168066-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "39d6cb4cfcb2413f8039ee9febf1c046", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8948ed3a-de", "ovs_interfaceid": "8948ed3a-de42-41ef-b7ca-c34a2d405b09", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 03:59:24 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Releasing lock "refresh_cache-ee89a9d5-0507-475c-bea2-e02e34d15710" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 03:59:24 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Updated the network info_cache for instance {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 23 03:59:24 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:59:24 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task 
ComputeManager._instance_usage_audit {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:59:24 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:59:25 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 03:59:28 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:59:29 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:59:33 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:59:34 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:59:39 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:59:39 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:59:39 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71428) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 23 03:59:39 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 03:59:39 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 03:59:39 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:59:44 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:59:44 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:59:44 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71428) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 23 03:59:44 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 03:59:44 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71428) _transition 
/usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 03:59:44 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:59:49 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:59:54 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 03:59:54 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:59:54 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71428) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 23 03:59:54 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 03:59:54 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 03:59:54 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:59:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 03:59:59 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:00:02 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-4c42b6aa-59d4-4d14-b7c5-040e6b42b7c9 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Acquiring lock "994da665-4a5f-41ce-a104-febce8be2557" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:00:02 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-4c42b6aa-59d4-4d14-b7c5-040e6b42b7c9 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Lock "994da665-4a5f-41ce-a104-febce8be2557" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:00:02 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-4c42b6aa-59d4-4d14-b7c5-040e6b42b7c9 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Acquiring lock "994da665-4a5f-41ce-a104-febce8be2557-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:00:02 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-4c42b6aa-59d4-4d14-b7c5-040e6b42b7c9 
tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Lock "994da665-4a5f-41ce-a104-febce8be2557-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:00:02 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-4c42b6aa-59d4-4d14-b7c5-040e6b42b7c9 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Lock "994da665-4a5f-41ce-a104-febce8be2557-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:00:02 user nova-compute[71428]: INFO nova.compute.manager [None req-4c42b6aa-59d4-4d14-b7c5-040e6b42b7c9 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Terminating instance Apr 23 04:00:02 user nova-compute[71428]: DEBUG nova.compute.manager [None req-4c42b6aa-59d4-4d14-b7c5-040e6b42b7c9 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Start destroying the instance on the hypervisor. {{(pid=71428) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 23 04:00:02 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:00:02 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:00:02 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:00:02 user nova-compute[71428]: DEBUG nova.compute.manager [req-f72379ae-f69c-4925-93dd-8d4fb5dec352 req-022c9255-5510-413f-9c18-dc1df5e83281 service nova] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Received event network-vif-unplugged-7f5011fe-2339-41ec-b6db-535375a33a9d {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 04:00:02 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-f72379ae-f69c-4925-93dd-8d4fb5dec352 req-022c9255-5510-413f-9c18-dc1df5e83281 service nova] Acquiring lock "994da665-4a5f-41ce-a104-febce8be2557-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:00:02 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-f72379ae-f69c-4925-93dd-8d4fb5dec352 req-022c9255-5510-413f-9c18-dc1df5e83281 service nova] Lock "994da665-4a5f-41ce-a104-febce8be2557-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:00:02 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-f72379ae-f69c-4925-93dd-8d4fb5dec352 req-022c9255-5510-413f-9c18-dc1df5e83281 service nova] Lock "994da665-4a5f-41ce-a104-febce8be2557-events" "released" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:00:02 user nova-compute[71428]: DEBUG nova.compute.manager [req-f72379ae-f69c-4925-93dd-8d4fb5dec352 req-022c9255-5510-413f-9c18-dc1df5e83281 service nova] [instance: 994da665-4a5f-41ce-a104-febce8be2557] No waiting events found dispatching network-vif-unplugged-7f5011fe-2339-41ec-b6db-535375a33a9d {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 04:00:02 user nova-compute[71428]: DEBUG nova.compute.manager [req-f72379ae-f69c-4925-93dd-8d4fb5dec352 req-022c9255-5510-413f-9c18-dc1df5e83281 service nova] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Received event network-vif-unplugged-7f5011fe-2339-41ec-b6db-535375a33a9d for instance with task_state deleting. {{(pid=71428) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 23 04:00:02 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:00:03 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Instance destroyed successfully. Apr 23 04:00:03 user nova-compute[71428]: DEBUG nova.objects.instance [None req-4c42b6aa-59d4-4d14-b7c5-040e6b42b7c9 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Lazy-loading 'resources' on Instance uuid 994da665-4a5f-41ce-a104-febce8be2557 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 04:00:03 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-4c42b6aa-59d4-4d14-b7c5-040e6b42b7c9 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:58:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1132169926',display_name='tempest-ServersNegativeTestJSON-server-1132169926',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-1132169926',id=20,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-23T03:58:18Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='39d6cb4cfcb2413f8039ee9febf1c046',ramdisk_id='',reservation_id='r-n7idxnsu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServersNegativeTestJSON-1115615559',owner_user_name='tempest-ServersNegativeTestJSON-1115615559-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-23T03:58:18Z,user_data=None,user_id='45d9ae4a3b00492fa2fc16a22a34df09',uuid=994da665-4a5f-41ce-a104-febce8be2557,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7f5011fe-2339-41ec-b6db-535375a33a9d", "address": "fa:16:3e:8e:ca:f0", "network": {"id": "083ef998-5361-46b7-827f-d0f086e9e552", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1112168066-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "39d6cb4cfcb2413f8039ee9febf1c046", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f5011fe-23", "ovs_interfaceid": "7f5011fe-2339-41ec-b6db-535375a33a9d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 23 04:00:03 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-4c42b6aa-59d4-4d14-b7c5-040e6b42b7c9 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Converting VIF {"id": 
"7f5011fe-2339-41ec-b6db-535375a33a9d", "address": "fa:16:3e:8e:ca:f0", "network": {"id": "083ef998-5361-46b7-827f-d0f086e9e552", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1112168066-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "39d6cb4cfcb2413f8039ee9febf1c046", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f5011fe-23", "ovs_interfaceid": "7f5011fe-2339-41ec-b6db-535375a33a9d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 04:00:03 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-4c42b6aa-59d4-4d14-b7c5-040e6b42b7c9 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8e:ca:f0,bridge_name='br-int',has_traffic_filtering=True,id=7f5011fe-2339-41ec-b6db-535375a33a9d,network=Network(083ef998-5361-46b7-827f-d0f086e9e552),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f5011fe-23') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 04:00:03 user nova-compute[71428]: DEBUG os_vif [None req-4c42b6aa-59d4-4d14-b7c5-040e6b42b7c9 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8e:ca:f0,bridge_name='br-int',has_traffic_filtering=True,id=7f5011fe-2339-41ec-b6db-535375a33a9d,network=Network(083ef998-5361-46b7-827f-d0f086e9e552),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f5011fe-23') {{(pid=71428) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 23 04:00:03 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:00:03 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7f5011fe-23, bridge=br-int, if_exists=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 04:00:03 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:00:03 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:00:03 user nova-compute[71428]: INFO os_vif [None req-4c42b6aa-59d4-4d14-b7c5-040e6b42b7c9 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Successfully unplugged vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:8e:ca:f0,bridge_name='br-int',has_traffic_filtering=True,id=7f5011fe-2339-41ec-b6db-535375a33a9d,network=Network(083ef998-5361-46b7-827f-d0f086e9e552),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f5011fe-23') Apr 23 04:00:03 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-4c42b6aa-59d4-4d14-b7c5-040e6b42b7c9 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Deleting instance files /opt/stack/data/nova/instances/994da665-4a5f-41ce-a104-febce8be2557_del Apr 23 04:00:03 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-4c42b6aa-59d4-4d14-b7c5-040e6b42b7c9 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Deletion of /opt/stack/data/nova/instances/994da665-4a5f-41ce-a104-febce8be2557_del complete Apr 23 04:00:03 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:00:03 user nova-compute[71428]: INFO nova.compute.manager [None req-4c42b6aa-59d4-4d14-b7c5-040e6b42b7c9 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Took 0.74 seconds to destroy the instance on the hypervisor. Apr 23 04:00:03 user nova-compute[71428]: DEBUG oslo.service.loopingcall [None req-4c42b6aa-59d4-4d14-b7c5-040e6b42b7c9 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71428) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 23 04:00:03 user nova-compute[71428]: DEBUG nova.compute.manager [-] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Deallocating network for instance {{(pid=71428) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 23 04:00:03 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: 994da665-4a5f-41ce-a104-febce8be2557] deallocate_for_instance() {{(pid=71428) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 23 04:00:03 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Updating instance_info_cache with network_info: [] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 04:00:03 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Took 0.49 seconds to deallocate network for instance. 
Apr 23 04:00:03 user nova-compute[71428]: DEBUG nova.compute.manager [req-e3b61cc2-896e-4ad9-a2c3-ee7c9487bb19 req-e4471c33-6b15-495c-b621-1394176b0837 service nova] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Received event network-vif-deleted-7f5011fe-2339-41ec-b6db-535375a33a9d {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 04:00:03 user nova-compute[71428]: INFO nova.compute.manager [req-e3b61cc2-896e-4ad9-a2c3-ee7c9487bb19 req-e4471c33-6b15-495c-b621-1394176b0837 service nova] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Neutron deleted interface 7f5011fe-2339-41ec-b6db-535375a33a9d; detaching it from the instance and deleting it from the info cache Apr 23 04:00:03 user nova-compute[71428]: DEBUG nova.network.neutron [req-e3b61cc2-896e-4ad9-a2c3-ee7c9487bb19 req-e4471c33-6b15-495c-b621-1394176b0837 service nova] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Updating instance_info_cache with network_info: [] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 04:00:03 user nova-compute[71428]: DEBUG nova.compute.manager [req-e3b61cc2-896e-4ad9-a2c3-ee7c9487bb19 req-e4471c33-6b15-495c-b621-1394176b0837 service nova] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Detach interface failed, port_id=7f5011fe-2339-41ec-b6db-535375a33a9d, reason: Instance 994da665-4a5f-41ce-a104-febce8be2557 could not be found. {{(pid=71428) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 23 04:00:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-4c42b6aa-59d4-4d14-b7c5-040e6b42b7c9 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:00:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-4c42b6aa-59d4-4d14-b7c5-040e6b42b7c9 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:00:03 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-4c42b6aa-59d4-4d14-b7c5-040e6b42b7c9 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 04:00:03 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-4c42b6aa-59d4-4d14-b7c5-040e6b42b7c9 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 04:00:03 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-4c42b6aa-59d4-4d14-b7c5-040e6b42b7c9 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.171s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:00:03 user nova-compute[71428]: INFO nova.scheduler.client.report [None req-4c42b6aa-59d4-4d14-b7c5-040e6b42b7c9 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Deleted allocations for instance 994da665-4a5f-41ce-a104-febce8be2557 Apr 23 04:00:04 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-4c42b6aa-59d4-4d14-b7c5-040e6b42b7c9 tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Lock "994da665-4a5f-41ce-a104-febce8be2557" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.578s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:00:04 user nova-compute[71428]: DEBUG nova.compute.manager [req-10bb6a6f-4835-40ec-a50a-6fa8fe85bf37 req-f94f2ea2-7c8f-48be-a909-ec0003ddd1a9 service nova] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Received event network-vif-plugged-7f5011fe-2339-41ec-b6db-535375a33a9d {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 04:00:04 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-10bb6a6f-4835-40ec-a50a-6fa8fe85bf37 req-f94f2ea2-7c8f-48be-a909-ec0003ddd1a9 service nova] Acquiring lock "994da665-4a5f-41ce-a104-febce8be2557-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:00:04 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-10bb6a6f-4835-40ec-a50a-6fa8fe85bf37 req-f94f2ea2-7c8f-48be-a909-ec0003ddd1a9 service nova] Lock "994da665-4a5f-41ce-a104-febce8be2557-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:00:04 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-10bb6a6f-4835-40ec-a50a-6fa8fe85bf37 req-f94f2ea2-7c8f-48be-a909-ec0003ddd1a9 service nova] Lock "994da665-4a5f-41ce-a104-febce8be2557-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:00:04 user nova-compute[71428]: DEBUG nova.compute.manager [req-10bb6a6f-4835-40ec-a50a-6fa8fe85bf37 req-f94f2ea2-7c8f-48be-a909-ec0003ddd1a9 service nova] [instance: 994da665-4a5f-41ce-a104-febce8be2557] No waiting events found dispatching network-vif-plugged-7f5011fe-2339-41ec-b6db-535375a33a9d {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 04:00:04 user nova-compute[71428]: WARNING nova.compute.manager [req-10bb6a6f-4835-40ec-a50a-6fa8fe85bf37 req-f94f2ea2-7c8f-48be-a909-ec0003ddd1a9 service nova] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Received unexpected event network-vif-plugged-7f5011fe-2339-41ec-b6db-535375a33a9d 
for instance with vm_state deleted and task_state None. Apr 23 04:00:08 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:00:13 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:00:13 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:00:15 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:00:15 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71428) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 23 04:00:17 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:00:18 user nova-compute[71428]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 04:00:18 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: 994da665-4a5f-41ce-a104-febce8be2557] VM Stopped (Lifecycle Event) Apr 23 04:00:18 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e71b0627-1004-4c64-851c-b15e94ca3241 None None] [instance: 994da665-4a5f-41ce-a104-febce8be2557] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 04:00:18 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:00:19 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:00:19 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:00:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:00:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:00:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:00:19 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Auditing locally available compute resources for user (node: user) {{(pid=71428) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 23 04:00:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:00:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:00:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:00:20 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:00:20 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/27abe5af-8845-40e6-a9c3-12399369b117/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:00:20 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/27abe5af-8845-40e6-a9c3-12399369b117/disk --force-share 
--output=json" returned: 0 in 0.140s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:00:20 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/27abe5af-8845-40e6-a9c3-12399369b117/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:00:20 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/27abe5af-8845-40e6-a9c3-12399369b117/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:00:20 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:00:20 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:00:20 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:00:20 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:00:21 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 04:00:21 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 23 04:00:21 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Hypervisor/Node resource view: name=user free_ram=8855MB free_disk=26.230655670166016GB free_vcpus=9 pci_devices=[{"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71428) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 23 04:00:21 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:00:21 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:00:21 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance ee89a9d5-0507-475c-bea2-e02e34d15710 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 04:00:21 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 27abe5af-8845-40e6-a9c3-12399369b117 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 04:00:21 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 04:00:21 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Total usable vcpus: 12, total allocated vcpus: 3 {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 23 04:00:21 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Final resource view: name=user phys_ram=16023MB used_ram=896MB phys_disk=40GB used_disk=3GB total_vcpus=12 used_vcpus=3 pci_stats=[] {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 23 04:00:21 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 04:00:21 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 04:00:21 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Compute_service record updated for user:user {{(pid=71428) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 23 04:00:21 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.280s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:00:23 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:00:25 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:00:25 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Starting heal instance info cache {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 23 04:00:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "refresh_cache-27abe5af-8845-40e6-a9c3-12399369b117" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 04:00:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] 
Acquired lock "refresh_cache-27abe5af-8845-40e6-a9c3-12399369b117" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 04:00:25 user nova-compute[71428]: DEBUG nova.network.neutron [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Forcefully refreshing network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 23 04:00:25 user nova-compute[71428]: DEBUG nova.network.neutron [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Updating instance_info_cache with network_info: [{"id": "85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d", "address": "fa:16:3e:ae:5c:9d", "network": {"id": "5aff0e90-d0f3-4716-9f61-98dc8c049108", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-790345692-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "db4b7bdc478946e2ad0a2060b433b42c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap85c6dc8b-d8", "ovs_interfaceid": "85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 04:00:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Releasing lock "refresh_cache-27abe5af-8845-40e6-a9c3-12399369b117" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 04:00:25 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Updated the network info_cache for instance {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 23 04:00:25 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:00:25 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:00:27 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:00:27 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71428) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:00:28 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:00:31 user nova-compute[71428]: INFO nova.compute.manager [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Rescuing Apr 23 04:00:31 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Acquiring lock "refresh_cache-7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 04:00:31 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Acquired lock "refresh_cache-7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 04:00:31 user nova-compute[71428]: DEBUG nova.network.neutron [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Building network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 23 04:00:31 user nova-compute[71428]: DEBUG nova.network.neutron [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Updating instance_info_cache with network_info: [{"id": "c79b33fd-e966-4247-a5e7-3b6ee20d635c", "address": "fa:16:3e:04:78:26", "network": {"id": "5aff0e90-d0f3-4716-9f61-98dc8c049108", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-790345692-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "db4b7bdc478946e2ad0a2060b433b42c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc79b33fd-e9", "ovs_interfaceid": "c79b33fd-e966-4247-a5e7-3b6ee20d635c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 04:00:31 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Releasing lock "refresh_cache-7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} 
Apr 23 04:00:31 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG nova.compute.manager [req-eb7f6342-1844-4f92-b064-5b59f7784ced req-f9df2113-fdca-4e03-a6e8-c31c9279d5bb service nova] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Received event network-vif-unplugged-c79b33fd-e966-4247-a5e7-3b6ee20d635c {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-eb7f6342-1844-4f92-b064-5b59f7784ced req-f9df2113-fdca-4e03-a6e8-c31c9279d5bb service nova] Acquiring lock "7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-eb7f6342-1844-4f92-b064-5b59f7784ced req-f9df2113-fdca-4e03-a6e8-c31c9279d5bb service nova] Lock "7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-eb7f6342-1844-4f92-b064-5b59f7784ced req-f9df2113-fdca-4e03-a6e8-c31c9279d5bb service nova] Lock "7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG nova.compute.manager [req-eb7f6342-1844-4f92-b064-5b59f7784ced req-f9df2113-fdca-4e03-a6e8-c31c9279d5bb service nova] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] No waiting events found dispatching network-vif-unplugged-c79b33fd-e966-4247-a5e7-3b6ee20d635c {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 04:00:32 user nova-compute[71428]: WARNING nova.compute.manager [req-eb7f6342-1844-4f92-b064-5b59f7784ced req-f9df2113-fdca-4e03-a6e8-c31c9279d5bb service nova] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Received unexpected event network-vif-unplugged-c79b33fd-e966-4247-a5e7-3b6ee20d635c for instance with vm_state active and task_state rescuing. Apr 23 04:00:32 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:00:32 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Instance destroyed successfully. 
Apr 23 04:00:32 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Attempting rescue Apr 23 04:00:32 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} {{(pid=71428) rescue /opt/stack/nova/nova/virt/libvirt/driver.py:4289}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Instance directory exists: not creating {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4694}} Apr 23 04:00:32 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Creating image(s) Apr 23 04:00:32 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Acquiring lock "/opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lock "/opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lock "/opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG nova.objects.instance [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lazy-loading 'trusted_certs' on Instance uuid 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 
04:00:32 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Acquiring lock "935dc3474dddbde110531a31510941caddc0ae83" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.127s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk.rescue {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk.rescue" returned: 0 in 0.056s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lock 
"935dc3474dddbde110531a31510941caddc0ae83" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.189s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG nova.objects.instance [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lazy-loading 'migration_context' on Instance uuid 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Created local disks {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Start _get_guest_xml network_info=[{"id": "c79b33fd-e966-4247-a5e7-3b6ee20d635c", "address": "fa:16:3e:04:78:26", "network": {"id": "5aff0e90-d0f3-4716-9f61-98dc8c049108", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-790345692-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-790345692-network", "vif_mac": "fa:16:3e:04:78:26"}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "db4b7bdc478946e2ad0a2060b433b42c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc79b33fd-e9", "ovs_interfaceid": "c79b33fd-e966-4247-a5e7-3b6ee20d635c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=) rescue={'image_id': 'e6127373-9931-4277-9458-eceef653ea1e', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG nova.objects.instance [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] 
Lazy-loading 'resources' on Instance uuid 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG nova.objects.instance [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lazy-loading 'numa_topology' on Instance uuid 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 04:00:32 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 04:00:32 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 04:00:32 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71428) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-23T03:43:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=), allow threads: True {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Flavor limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Image limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 
tempest-ServerRescueNegativeTestJSON-509507797-project-member] Flavor pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Image pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Got 1 possible topologies {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG nova.objects.instance [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lazy-loading 'vcpu_model' on Instance uuid 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:58:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1609850900',display_name='tempest-ServerRescueNegativeTestJSON-server-1609850900',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-1609850900',id=22,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-23T03:58:48Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='db4b7bdc478946e2ad0a2060b433b42c',ramdisk_id='',reservation_id='r-zpb2ujai',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerRescueNegativeTestJSON-509507797',owner_user_name='tempest-ServerRescueNegativeTestJSON-509507797-project-member'},tags=,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T03:58:48Z,user_data=None,user_id='84b2f522c7164c5c934b8bbf113b542a',uuid=7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c79b33fd-e966-4247-a5e7-3b6ee20d635c", "address": "fa:16:3e:04:78:26", "network": {"id": "5aff0e90-d0f3-4716-9f61-98dc8c049108", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-790345692-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-790345692-network", "vif_mac": "fa:16:3e:04:78:26"}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "db4b7bdc478946e2ad0a2060b433b42c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc79b33fd-e9", "ovs_interfaceid": "c79b33fd-e966-4247-a5e7-3b6ee20d635c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71428) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None 
req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Converting VIF {"id": "c79b33fd-e966-4247-a5e7-3b6ee20d635c", "address": "fa:16:3e:04:78:26", "network": {"id": "5aff0e90-d0f3-4716-9f61-98dc8c049108", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-790345692-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-790345692-network", "vif_mac": "fa:16:3e:04:78:26"}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "db4b7bdc478946e2ad0a2060b433b42c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc79b33fd-e9", "ovs_interfaceid": "c79b33fd-e966-4247-a5e7-3b6ee20d635c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:04:78:26,bridge_name='br-int',has_traffic_filtering=True,id=c79b33fd-e966-4247-a5e7-3b6ee20d635c,network=Network(5aff0e90-d0f3-4716-9f61-98dc8c049108),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc79b33fd-e9') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG nova.objects.instance [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lazy-loading 'pci_devices' on Instance uuid 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] End _get_guest_xml xml= [guest domain XML omitted: the element markup was lost in extraction and only scattered text nodes survived; the recoverable values are uuid 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23, libvirt name instance-00000016, memory 131072 (KiB), 1 vCPU, Nova metadata (server name tempest-ServerRescueNegativeTestJSON-server-1609850900, creation time 2023-04-23 04:00:32, flavor values 128/1/0/0/1, owner tempest-ServerRescueNegativeTestJSON-509507797), sysinfo OpenStack Foundation / OpenStack Nova / 0.0.0 / Virtual Machine, os type hvm, CPU model Nehalem, and an RNG device backed by /dev/urandom] {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 23 04:00:32 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Instance destroyed successfully.
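For readers tracing the rescue flow above, here is a minimal, illustrative sketch (not Nova's own code) of the two qemu-img operations the log records while building the rescue disk: a resource-limited `qemu-img info` on the cached base image, followed by creation of the qcow2 overlay that backs onto it. Paths, limits, and flags are copied from the log entries; it assumes oslo.concurrency is installed.

```python
from oslo_concurrency import processutils

# Paths exactly as logged above.
BASE = '/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83'
RESCUE = ('/opt/stack/data/nova/instances/'
          '7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk.rescue')

# qemu-img info runs under prlimit (1 GiB address space, 30 s of CPU) so a
# corrupt or hostile image cannot hang or exhaust the compute host; this is
# what produces the "python3.10 -m oslo_concurrency.prlimit ..." command line.
limits = processutils.ProcessLimits(address_space=1024 ** 3, cpu_time=30)
info_json, _err = processutils.execute(
    'env', 'LC_ALL=C', 'LANG=C',
    'qemu-img', 'info', BASE, '--force-share', '--output=json',
    prlimit=limits)

# The rescue disk is a qcow2 copy-on-write overlay on the shared raw base
# image, which is why its creation completes in ~0.06 s in the log.
processutils.execute(
    'env', 'LC_ALL=C', 'LANG=C',
    'qemu-img', 'create', '-f', 'qcow2',
    '-o', 'backing_file=%s,backing_fmt=raw' % BASE,
    RESCUE)
```

Because the rescue disk is only an overlay, it is cheap to create and can be discarded on unrescue without touching the shared base image.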
Apr 23 04:00:32 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] No BDM found with device name vda, not building metadata. {{(pid=71428) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] No BDM found with device name vdb, not building metadata. {{(pid=71428) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 23 04:00:32 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] No VIF found with MAC fa:16:3e:04:78:26, not building metadata {{(pid=71428) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 23 04:00:33 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:00:33 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:00:33 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:00:33 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:00:34 user nova-compute[71428]: DEBUG nova.compute.manager [req-bb813bb9-dd53-4b64-9be3-ebe252942470 req-db000270-ac57-4f4c-b929-d26cb25678fe service nova] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Received event network-vif-plugged-c79b33fd-e966-4247-a5e7-3b6ee20d635c {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 04:00:34 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-bb813bb9-dd53-4b64-9be3-ebe252942470 req-db000270-ac57-4f4c-b929-d26cb25678fe service nova] Acquiring lock "7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:00:34 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-bb813bb9-dd53-4b64-9be3-ebe252942470 req-db000270-ac57-4f4c-b929-d26cb25678fe service nova] Lock "7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:00:34 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-bb813bb9-dd53-4b64-9be3-ebe252942470 req-db000270-ac57-4f4c-b929-d26cb25678fe service nova] Lock "7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:00:34 user nova-compute[71428]: DEBUG 
nova.compute.manager [req-bb813bb9-dd53-4b64-9be3-ebe252942470 req-db000270-ac57-4f4c-b929-d26cb25678fe service nova] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] No waiting events found dispatching network-vif-plugged-c79b33fd-e966-4247-a5e7-3b6ee20d635c {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 04:00:34 user nova-compute[71428]: WARNING nova.compute.manager [req-bb813bb9-dd53-4b64-9be3-ebe252942470 req-db000270-ac57-4f4c-b929-d26cb25678fe service nova] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Received unexpected event network-vif-plugged-c79b33fd-e966-4247-a5e7-3b6ee20d635c for instance with vm_state active and task_state rescuing. Apr 23 04:00:34 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:00:34 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:00:36 user nova-compute[71428]: DEBUG nova.virt.libvirt.host [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Removed pending event for 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23 due to event {{(pid=71428) _event_emit_delayed /opt/stack/nova/nova/virt/libvirt/host.py:438}} Apr 23 04:00:36 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Resumed> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 04:00:36 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] VM Resumed (Lifecycle Event) Apr 23 04:00:36 user nova-compute[71428]: DEBUG nova.compute.manager [None req-813f3336-1227-462e-bd19-550c1f314b09 tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 04:00:36 user nova-compute[71428]: DEBUG nova.compute.manager [req-20f7ae2f-e649-471f-9592-f00a6d3f00c8 req-59500b07-e64a-46e7-a58f-c7d85bd5757f service nova] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Received event network-vif-plugged-c79b33fd-e966-4247-a5e7-3b6ee20d635c {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 04:00:36 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-20f7ae2f-e649-471f-9592-f00a6d3f00c8 req-59500b07-e64a-46e7-a58f-c7d85bd5757f service nova] Acquiring lock "7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:00:36 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-20f7ae2f-e649-471f-9592-f00a6d3f00c8 req-59500b07-e64a-46e7-a58f-c7d85bd5757f service nova] Lock "7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:00:36 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-20f7ae2f-e649-471f-9592-f00a6d3f00c8 req-59500b07-e64a-46e7-a58f-c7d85bd5757f service nova] Lock 
"7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:00:36 user nova-compute[71428]: DEBUG nova.compute.manager [req-20f7ae2f-e649-471f-9592-f00a6d3f00c8 req-59500b07-e64a-46e7-a58f-c7d85bd5757f service nova] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] No waiting events found dispatching network-vif-plugged-c79b33fd-e966-4247-a5e7-3b6ee20d635c {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 04:00:36 user nova-compute[71428]: WARNING nova.compute.manager [req-20f7ae2f-e649-471f-9592-f00a6d3f00c8 req-59500b07-e64a-46e7-a58f-c7d85bd5757f service nova] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Received unexpected event network-vif-plugged-c79b33fd-e966-4247-a5e7-3b6ee20d635c for instance with vm_state active and task_state rescuing. Apr 23 04:00:36 user nova-compute[71428]: DEBUG nova.compute.manager [req-20f7ae2f-e649-471f-9592-f00a6d3f00c8 req-59500b07-e64a-46e7-a58f-c7d85bd5757f service nova] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Received event network-vif-plugged-c79b33fd-e966-4247-a5e7-3b6ee20d635c {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 04:00:36 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-20f7ae2f-e649-471f-9592-f00a6d3f00c8 req-59500b07-e64a-46e7-a58f-c7d85bd5757f service nova] Acquiring lock "7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:00:36 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-20f7ae2f-e649-471f-9592-f00a6d3f00c8 req-59500b07-e64a-46e7-a58f-c7d85bd5757f service nova] Lock "7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:00:36 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-20f7ae2f-e649-471f-9592-f00a6d3f00c8 req-59500b07-e64a-46e7-a58f-c7d85bd5757f service nova] Lock "7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:00:36 user nova-compute[71428]: DEBUG nova.compute.manager [req-20f7ae2f-e649-471f-9592-f00a6d3f00c8 req-59500b07-e64a-46e7-a58f-c7d85bd5757f service nova] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] No waiting events found dispatching network-vif-plugged-c79b33fd-e966-4247-a5e7-3b6ee20d635c {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 04:00:36 user nova-compute[71428]: WARNING nova.compute.manager [req-20f7ae2f-e649-471f-9592-f00a6d3f00c8 req-59500b07-e64a-46e7-a58f-c7d85bd5757f service nova] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Received unexpected event network-vif-plugged-c79b33fd-e966-4247-a5e7-3b6ee20d635c for instance with vm_state active and task_state rescuing. 
Apr 23 04:00:36 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 04:00:36 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 04:00:36 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] During sync_power_state the instance has a pending task (rescuing). Skip. Apr 23 04:00:36 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Started> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 04:00:36 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] VM Started (Lifecycle Event) Apr 23 04:00:36 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 04:00:36 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 04:00:38 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:00:38 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:00:43 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:00:48 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:00:48 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:00:53 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:00:53 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:00:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} 
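The two power-state synchronizations above ("Resumed", then "Started") both end without action: the first because the instance still has a pending task (rescuing), the second because the database and hypervisor already agree (power_state 1 on both sides). A small sketch of that decision, under the assumption that the rule is exactly what the log messages state:

```python
def should_sync_power_state(task_state, db_power_state, vm_power_state):
    # A pending task (here 'rescuing') means the state is expected to be in
    # flux, so the handler logs "Skip." and does nothing.
    if task_state is not None:
        return False
    # Otherwise only intervene when the DB and the hypervisor disagree.
    return db_power_state != vm_power_state

# Values taken from the log entries above.
assert should_sync_power_state('rescuing', 1, 1) is False  # first sync: Skip.
assert should_sync_power_state(None, 1, 1) is False        # second sync: already consistent
```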
Apr 23 04:00:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:01:03 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:01:03 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:01:08 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:01:13 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:01:13 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:01:17 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:01:17 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71428) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 23 04:01:18 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:01:19 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:01:19 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:01:20 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:01:20 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:01:20 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:01:20 user nova-compute[71428]: DEBUG 
oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:01:20 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Auditing locally available compute resources for user (node: user) {{(pid=71428) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 23 04:01:20 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk.rescue --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:01:20 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk.rescue --force-share --output=json" returned: 0 in 0.129s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:01:20 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk.rescue --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:01:21 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk.rescue --force-share --output=json" returned: 0 in 0.132s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:01:21 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:01:21 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:01:21 user nova-compute[71428]: DEBUG 
oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:01:21 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:01:21 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/27abe5af-8845-40e6-a9c3-12399369b117/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:01:21 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/27abe5af-8845-40e6-a9c3-12399369b117/disk --force-share --output=json" returned: 0 in 0.145s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:01:21 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/27abe5af-8845-40e6-a9c3-12399369b117/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:01:21 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/27abe5af-8845-40e6-a9c3-12399369b117/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:01:21 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:01:21 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk --force-share --output=json" returned: 0 in 0.126s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:01:21 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:01:21 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:01:22 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 04:01:22 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 04:01:22 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Hypervisor/Node resource view: name=user free_ram=8918MB free_disk=26.21036148071289GB free_vcpus=9 pci_devices=[{"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", 
"vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, 
{"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71428) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 23 04:01:22 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:01:22 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:01:22 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance ee89a9d5-0507-475c-bea2-e02e34d15710 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 04:01:22 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 27abe5af-8845-40e6-a9c3-12399369b117 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 04:01:22 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 04:01:22 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Total usable vcpus: 12, total allocated vcpus: 3 {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 23 04:01:22 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Final resource view: name=user phys_ram=16023MB used_ram=896MB phys_disk=40GB used_disk=3GB total_vcpus=12 used_vcpus=3 pci_stats=[] {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 23 04:01:22 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 04:01:22 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 04:01:22 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Compute_service record updated for user:user {{(pid=71428) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 23 04:01:22 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.298s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:01:23 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:01:23 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running 
periodic task ComputeManager._poll_volume_usage {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:01:24 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:01:24 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Starting heal instance info cache {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 23 04:01:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "refresh_cache-7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 04:01:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquired lock "refresh_cache-7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 04:01:25 user nova-compute[71428]: DEBUG nova.network.neutron [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Forcefully refreshing network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 23 04:01:25 user nova-compute[71428]: DEBUG nova.network.neutron [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Updating instance_info_cache with network_info: [{"id": "c79b33fd-e966-4247-a5e7-3b6ee20d635c", "address": "fa:16:3e:04:78:26", "network": {"id": "5aff0e90-d0f3-4716-9f61-98dc8c049108", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-790345692-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "db4b7bdc478946e2ad0a2060b433b42c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc79b33fd-e9", "ovs_interfaceid": "c79b33fd-e966-4247-a5e7-3b6ee20d635c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 04:01:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Releasing lock "refresh_cache-7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 04:01:25 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Updated the network info_cache for instance {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 23 
04:01:25 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:01:27 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:01:28 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:01:29 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:01:33 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:01:38 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:01:43 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:01:48 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:01:53 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:01:54 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:01:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:02:03 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:02:08 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:02:13 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:02:14 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:02:15 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=71428) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:02:15 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Cleaning up deleted instances with incomplete migration {{(pid=71428) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 23 04:02:16 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:02:17 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:02:17 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71428) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 23 04:02:18 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:02:19 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:02:21 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:02:21 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:02:21 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:02:21 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:02:21 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Auditing locally available compute resources for user (node: user) {{(pid=71428) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 23 04:02:21 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): 
/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk.rescue --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:02:21 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk.rescue --force-share --output=json" returned: 0 in 0.132s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:02:21 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk.rescue --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:02:22 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk.rescue --force-share --output=json" returned: 0 in 0.132s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:02:22 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:02:22 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:02:22 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:02:22 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk --force-share --output=json" returned: 0 
in 0.141s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:02:22 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/27abe5af-8845-40e6-a9c3-12399369b117/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:02:22 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/27abe5af-8845-40e6-a9c3-12399369b117/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:02:22 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/27abe5af-8845-40e6-a9c3-12399369b117/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:02:22 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/27abe5af-8845-40e6-a9c3-12399369b117/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:02:22 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:02:22 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:02:22 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:02:22 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None 
req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:02:23 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:02:23 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 04:02:23 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 04:02:23 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Hypervisor/Node resource view: name=user free_ram=8910MB free_disk=26.20970916748047GB free_vcpus=9 pci_devices=[{"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": 
"type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", 
"vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71428) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 23 04:02:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:02:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:02:23 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance ee89a9d5-0507-475c-bea2-e02e34d15710 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 04:02:23 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 27abe5af-8845-40e6-a9c3-12399369b117 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 04:02:23 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 04:02:23 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Total usable vcpus: 12, total allocated vcpus: 3 {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 23 04:02:23 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Final resource view: name=user phys_ram=16023MB used_ram=896MB phys_disk=40GB used_disk=3GB total_vcpus=12 used_vcpus=3 pci_stats=[] {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 23 04:02:23 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 04:02:23 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 04:02:23 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Compute_service record updated for user:user {{(pid=71428) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 23 04:02:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.393s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:02:25 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:02:25 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Starting heal instance info cache {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 23 04:02:25 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Rebuilding the list of instances to heal {{(pid=71428) 
_heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 23 04:02:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "refresh_cache-ee89a9d5-0507-475c-bea2-e02e34d15710" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 04:02:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquired lock "refresh_cache-ee89a9d5-0507-475c-bea2-e02e34d15710" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 04:02:25 user nova-compute[71428]: DEBUG nova.network.neutron [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Forcefully refreshing network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 23 04:02:25 user nova-compute[71428]: DEBUG nova.objects.instance [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lazy-loading 'info_cache' on Instance uuid ee89a9d5-0507-475c-bea2-e02e34d15710 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 04:02:26 user nova-compute[71428]: DEBUG nova.network.neutron [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Updating instance_info_cache with network_info: [{"id": "8948ed3a-de42-41ef-b7ca-c34a2d405b09", "address": "fa:16:3e:d2:42:a8", "network": {"id": "083ef998-5361-46b7-827f-d0f086e9e552", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1112168066-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "39d6cb4cfcb2413f8039ee9febf1c046", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8948ed3a-de", "ovs_interfaceid": "8948ed3a-de42-41ef-b7ca-c34a2d405b09", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 04:02:26 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Releasing lock "refresh_cache-ee89a9d5-0507-475c-bea2-e02e34d15710" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 04:02:26 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Updated the network info_cache for instance {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 23 04:02:26 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:02:27 user nova-compute[71428]: DEBUG 
oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:02:27 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:02:27 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:02:28 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:02:29 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._sync_power_states {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:02:29 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Triggering sync for uuid ee89a9d5-0507-475c-bea2-e02e34d15710 {{(pid=71428) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 23 04:02:29 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Triggering sync for uuid 27abe5af-8845-40e6-a9c3-12399369b117 {{(pid=71428) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 23 04:02:29 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Triggering sync for uuid 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23 {{(pid=71428) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 23 04:02:29 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "ee89a9d5-0507-475c-bea2-e02e34d15710" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:02:29 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "ee89a9d5-0507-475c-bea2-e02e34d15710" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:02:29 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "27abe5af-8845-40e6-a9c3-12399369b117" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:02:29 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "27abe5af-8845-40e6-a9c3-12399369b117" acquired by
"nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:02:29 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:02:29 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:02:29 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "27abe5af-8845-40e6-a9c3-12399369b117" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.031s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:02:29 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.032s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:02:29 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "ee89a9d5-0507-475c-bea2-e02e34d15710" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.053s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:02:31 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:02:33 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:02:38 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:02:40 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:02:40 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Cleaning up deleted instances {{(pid=71428) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 23 04:02:40 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] There are 0
instances to clean {{(pid=71428) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 23 04:02:43 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:02:48 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:02:48 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:02:48 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71428) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 23 04:02:48 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 04:02:48 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 04:02:48 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:02:53 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:02:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:02:59 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:03:03 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:03:08 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:03:13 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:03:16 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:03:18 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:03:19 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:03:19 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CONF.reclaim_instance_interval <= 0, 
skipping... {{(pid=71428) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 23 04:03:20 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:03:23 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:03:23 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:03:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:03:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:03:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:03:23 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Auditing locally available compute resources for user (node: user) {{(pid=71428) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 23 04:03:23 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk.rescue --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:03:23 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk.rescue --force-share --output=json" returned: 0 in 0.138s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:03:23 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk.rescue --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:03:24 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk.rescue --force-share --output=json" returned: 0 in 0.130s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:03:24 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:03:24 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:03:24 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:03:24 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:03:24 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/27abe5af-8845-40e6-a9c3-12399369b117/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:03:24 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/27abe5af-8845-40e6-a9c3-12399369b117/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 
04:03:24 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/27abe5af-8845-40e6-a9c3-12399369b117/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:03:24 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/27abe5af-8845-40e6-a9c3-12399369b117/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:03:24 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:03:24 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk --force-share --output=json" returned: 0 in 0.146s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:03:24 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:03:24 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk --force-share --output=json" returned: 0 in 0.163s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:03:25 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 04:03:25 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 23 04:03:25 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Hypervisor/Node resource view: name=user free_ram=8902MB free_disk=26.208938598632812GB free_vcpus=9 pci_devices=[{"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71428) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 23 04:03:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:03:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:03:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-15edb0a1-b10a-403d-9178-a79b05664efe tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Acquiring lock "7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:03:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-15edb0a1-b10a-403d-9178-a79b05664efe tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lock "7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:03:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-15edb0a1-b10a-403d-9178-a79b05664efe tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Acquiring lock "7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:03:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-15edb0a1-b10a-403d-9178-a79b05664efe tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lock "7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:03:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None 
req-15edb0a1-b10a-403d-9178-a79b05664efe tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lock "7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:03:25 user nova-compute[71428]: INFO nova.compute.manager [None req-15edb0a1-b10a-403d-9178-a79b05664efe tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Terminating instance Apr 23 04:03:25 user nova-compute[71428]: DEBUG nova.compute.manager [None req-15edb0a1-b10a-403d-9178-a79b05664efe tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Start destroying the instance on the hypervisor. {{(pid=71428) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 23 04:03:25 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:03:25 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance ee89a9d5-0507-475c-bea2-e02e34d15710 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 04:03:25 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 27abe5af-8845-40e6-a9c3-12399369b117 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 04:03:25 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 04:03:25 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Total usable vcpus: 12, total allocated vcpus: 3 {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 23 04:03:25 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Final resource view: name=user phys_ram=16023MB used_ram=896MB phys_disk=40GB used_disk=3GB total_vcpus=12 used_vcpus=3 pci_stats=[] {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 23 04:03:25 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Refreshing inventories for resource provider 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 23 04:03:25 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Updating ProviderTree inventory for provider 3017e09c-9289-4a8e-8061-3ff90149e985 from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 23 04:03:25 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Updating inventory in ProviderTree for provider 3017e09c-9289-4a8e-8061-3ff90149e985 with inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 23 04:03:25 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Refreshing aggregate associations for resource provider 3017e09c-9289-4a8e-8061-3ff90149e985, aggregates: None {{(pid=71428) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 23 04:03:25 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Refreshing trait associations for resource provider 3017e09c-9289-4a8e-8061-3ff90149e985, traits: 
COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE41,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_RESCUE_BFV,COMPUTE_NODE,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_GRAPHICS_MODEL_CIRRUS {{(pid=71428) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 23 04:03:25 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:03:25 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 04:03:25 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:03:25 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 04:03:25 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Compute_service record updated for user:user {{(pid=71428) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 23 04:03:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.420s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:03:25 user nova-compute[71428]: DEBUG nova.compute.manager [req-1040585d-4bd7-4baf-98d5-b3f9aa0a7f0b req-0bf3000e-23f8-4d4e-bc08-2baa2764f25b service nova] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Received event 
network-vif-unplugged-c79b33fd-e966-4247-a5e7-3b6ee20d635c {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 04:03:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-1040585d-4bd7-4baf-98d5-b3f9aa0a7f0b req-0bf3000e-23f8-4d4e-bc08-2baa2764f25b service nova] Acquiring lock "7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:03:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-1040585d-4bd7-4baf-98d5-b3f9aa0a7f0b req-0bf3000e-23f8-4d4e-bc08-2baa2764f25b service nova] Lock "7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:03:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-1040585d-4bd7-4baf-98d5-b3f9aa0a7f0b req-0bf3000e-23f8-4d4e-bc08-2baa2764f25b service nova] Lock "7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:03:25 user nova-compute[71428]: DEBUG nova.compute.manager [req-1040585d-4bd7-4baf-98d5-b3f9aa0a7f0b req-0bf3000e-23f8-4d4e-bc08-2baa2764f25b service nova] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] No waiting events found dispatching network-vif-unplugged-c79b33fd-e966-4247-a5e7-3b6ee20d635c {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 04:03:25 user nova-compute[71428]: DEBUG nova.compute.manager [req-1040585d-4bd7-4baf-98d5-b3f9aa0a7f0b req-0bf3000e-23f8-4d4e-bc08-2baa2764f25b service nova] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Received event network-vif-unplugged-c79b33fd-e966-4247-a5e7-3b6ee20d635c for instance with task_state deleting. {{(pid=71428) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 23 04:03:25 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:03:25 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Instance destroyed successfully. 
Apr 23 04:03:25 user nova-compute[71428]: DEBUG nova.objects.instance [None req-15edb0a1-b10a-403d-9178-a79b05664efe tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lazy-loading 'resources' on Instance uuid 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 04:03:25 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-15edb0a1-b10a-403d-9178-a79b05664efe tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:58:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1609850900',display_name='tempest-ServerRescueNegativeTestJSON-server-1609850900',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-1609850900',id=22,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-23T04:00:36Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='db4b7bdc478946e2ad0a2060b433b42c',ramdisk_id='',reservation_id='r-zpb2ujai',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerRescueNegativeTestJSON-509507797',owner_user_name='tempest-ServerRescueNegativeTestJSON-509507797-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-23T04:00:36Z,user_data=None,user_id='84b2f522c7164c5c934b8bbf113b542a',uuid=7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23,vcpu_model=,vcpus=1,vm_mode=None,vm_state='rescued') vif={"id": "c79b33fd-e966-4247-a5e7-3b6ee20d635c", "address": "fa:16:3e:04:78:26", "network": {"id": "5aff0e90-d0f3-4716-9f61-98dc8c049108", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-790345692-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "db4b7bdc478946e2ad0a2060b433b42c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": 
true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc79b33fd-e9", "ovs_interfaceid": "c79b33fd-e966-4247-a5e7-3b6ee20d635c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 23 04:03:25 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-15edb0a1-b10a-403d-9178-a79b05664efe tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Converting VIF {"id": "c79b33fd-e966-4247-a5e7-3b6ee20d635c", "address": "fa:16:3e:04:78:26", "network": {"id": "5aff0e90-d0f3-4716-9f61-98dc8c049108", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-790345692-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "db4b7bdc478946e2ad0a2060b433b42c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc79b33fd-e9", "ovs_interfaceid": "c79b33fd-e966-4247-a5e7-3b6ee20d635c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 04:03:25 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-15edb0a1-b10a-403d-9178-a79b05664efe tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:04:78:26,bridge_name='br-int',has_traffic_filtering=True,id=c79b33fd-e966-4247-a5e7-3b6ee20d635c,network=Network(5aff0e90-d0f3-4716-9f61-98dc8c049108),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc79b33fd-e9') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 04:03:25 user nova-compute[71428]: DEBUG os_vif [None req-15edb0a1-b10a-403d-9178-a79b05664efe tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:04:78:26,bridge_name='br-int',has_traffic_filtering=True,id=c79b33fd-e966-4247-a5e7-3b6ee20d635c,network=Network(5aff0e90-d0f3-4716-9f61-98dc8c049108),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc79b33fd-e9') {{(pid=71428) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 23 04:03:25 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:03:25 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc79b33fd-e9, bridge=br-int, if_exists=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 04:03:26 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 
{{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:03:26 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:03:26 user nova-compute[71428]: INFO os_vif [None req-15edb0a1-b10a-403d-9178-a79b05664efe tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:04:78:26,bridge_name='br-int',has_traffic_filtering=True,id=c79b33fd-e966-4247-a5e7-3b6ee20d635c,network=Network(5aff0e90-d0f3-4716-9f61-98dc8c049108),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc79b33fd-e9') Apr 23 04:03:26 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-15edb0a1-b10a-403d-9178-a79b05664efe tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Deleting instance files /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23_del Apr 23 04:03:26 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-15edb0a1-b10a-403d-9178-a79b05664efe tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Deletion of /opt/stack/data/nova/instances/7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23_del complete Apr 23 04:03:26 user nova-compute[71428]: INFO nova.compute.manager [None req-15edb0a1-b10a-403d-9178-a79b05664efe tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Took 0.69 seconds to destroy the instance on the hypervisor. Apr 23 04:03:26 user nova-compute[71428]: DEBUG oslo.service.loopingcall [None req-15edb0a1-b10a-403d-9178-a79b05664efe tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71428) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 23 04:03:26 user nova-compute[71428]: DEBUG nova.compute.manager [-] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Deallocating network for instance {{(pid=71428) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 23 04:03:26 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] deallocate_for_instance() {{(pid=71428) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 23 04:03:26 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Updating instance_info_cache with network_info: [] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 04:03:26 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Took 0.61 seconds to deallocate network for instance. 
Apr 23 04:03:26 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-15edb0a1-b10a-403d-9178-a79b05664efe tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:03:26 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-15edb0a1-b10a-403d-9178-a79b05664efe tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:03:26 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:03:26 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Starting heal instance info cache {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 23 04:03:26 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "refresh_cache-27abe5af-8845-40e6-a9c3-12399369b117" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 04:03:26 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquired lock "refresh_cache-27abe5af-8845-40e6-a9c3-12399369b117" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 04:03:26 user nova-compute[71428]: DEBUG nova.network.neutron [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Forcefully refreshing network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 23 04:03:26 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-15edb0a1-b10a-403d-9178-a79b05664efe tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 04:03:26 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-15edb0a1-b10a-403d-9178-a79b05664efe tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 04:03:27 user nova-compute[71428]: 
DEBUG oslo_concurrency.lockutils [None req-15edb0a1-b10a-403d-9178-a79b05664efe tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.478s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:03:27 user nova-compute[71428]: INFO nova.scheduler.client.report [None req-15edb0a1-b10a-403d-9178-a79b05664efe tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Deleted allocations for instance 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23 Apr 23 04:03:27 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-15edb0a1-b10a-403d-9178-a79b05664efe tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lock "7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.987s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:03:27 user nova-compute[71428]: DEBUG nova.network.neutron [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Updating instance_info_cache with network_info: [{"id": "85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d", "address": "fa:16:3e:ae:5c:9d", "network": {"id": "5aff0e90-d0f3-4716-9f61-98dc8c049108", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-790345692-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "db4b7bdc478946e2ad0a2060b433b42c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap85c6dc8b-d8", "ovs_interfaceid": "85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 04:03:27 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Releasing lock "refresh_cache-27abe5af-8845-40e6-a9c3-12399369b117" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 04:03:27 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Updated the network info_cache for instance {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 23 04:03:27 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:03:27 user nova-compute[71428]: DEBUG nova.compute.manager [req-0f29b9b8-dd98-4c6a-a303-a9de2bf821d2 
req-473b7f03-f489-497b-ba9a-e8e44235a4c5 service nova] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Received event network-vif-plugged-c79b33fd-e966-4247-a5e7-3b6ee20d635c {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 04:03:27 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-0f29b9b8-dd98-4c6a-a303-a9de2bf821d2 req-473b7f03-f489-497b-ba9a-e8e44235a4c5 service nova] Acquiring lock "7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:03:27 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-0f29b9b8-dd98-4c6a-a303-a9de2bf821d2 req-473b7f03-f489-497b-ba9a-e8e44235a4c5 service nova] Lock "7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:03:27 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-0f29b9b8-dd98-4c6a-a303-a9de2bf821d2 req-473b7f03-f489-497b-ba9a-e8e44235a4c5 service nova] Lock "7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:03:27 user nova-compute[71428]: DEBUG nova.compute.manager [req-0f29b9b8-dd98-4c6a-a303-a9de2bf821d2 req-473b7f03-f489-497b-ba9a-e8e44235a4c5 service nova] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] No waiting events found dispatching network-vif-plugged-c79b33fd-e966-4247-a5e7-3b6ee20d635c {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 04:03:27 user nova-compute[71428]: WARNING nova.compute.manager [req-0f29b9b8-dd98-4c6a-a303-a9de2bf821d2 req-473b7f03-f489-497b-ba9a-e8e44235a4c5 service nova] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Received unexpected event network-vif-plugged-c79b33fd-e966-4247-a5e7-3b6ee20d635c for instance with vm_state deleted and task_state None. 
Apr 23 04:03:27 user nova-compute[71428]: DEBUG nova.compute.manager [req-0f29b9b8-dd98-4c6a-a303-a9de2bf821d2 req-473b7f03-f489-497b-ba9a-e8e44235a4c5 service nova] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Received event network-vif-deleted-c79b33fd-e966-4247-a5e7-3b6ee20d635c {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 04:03:28 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:03:29 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:03:31 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:03:32 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:03:36 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:03:40 user nova-compute[71428]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 04:03:40 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] VM Stopped (Lifecycle Event) Apr 23 04:03:40 user nova-compute[71428]: DEBUG nova.compute.manager [None req-397ea214-0960-4ba9-b52e-fdf310ade5b0 None None] [instance: 7d5ef4db-70d7-4545-9ddb-92ecdf4ebb23] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 04:03:41 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:03:46 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:03:51 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:03:56 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:03:56 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:03:56 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71428) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 23 04:03:56 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71428) _transition 
/usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 04:03:56 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 04:03:56 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:04:01 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:04:06 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:04:11 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:04:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-0338fc77-aede-436c-9a0d-2f02fc1bffbc tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Acquiring lock "27abe5af-8845-40e6-a9c3-12399369b117" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:04:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-0338fc77-aede-436c-9a0d-2f02fc1bffbc tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lock "27abe5af-8845-40e6-a9c3-12399369b117" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:04:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-0338fc77-aede-436c-9a0d-2f02fc1bffbc tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Acquiring lock "27abe5af-8845-40e6-a9c3-12399369b117-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:04:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-0338fc77-aede-436c-9a0d-2f02fc1bffbc tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lock "27abe5af-8845-40e6-a9c3-12399369b117-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:04:15 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-0338fc77-aede-436c-9a0d-2f02fc1bffbc tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lock "27abe5af-8845-40e6-a9c3-12399369b117-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:04:15 user nova-compute[71428]: INFO nova.compute.manager [None req-0338fc77-aede-436c-9a0d-2f02fc1bffbc 
tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Terminating instance Apr 23 04:04:15 user nova-compute[71428]: DEBUG nova.compute.manager [None req-0338fc77-aede-436c-9a0d-2f02fc1bffbc tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Start destroying the instance on the hypervisor. {{(pid=71428) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 23 04:04:15 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:04:15 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:04:15 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:04:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:04:16 user nova-compute[71428]: DEBUG nova.compute.manager [req-9b0ee0b8-ff45-44fa-b95b-136f49b2fad4 req-6af5f681-77a1-430a-bdd4-b10e012f3762 service nova] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Received event network-vif-unplugged-85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 04:04:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-9b0ee0b8-ff45-44fa-b95b-136f49b2fad4 req-6af5f681-77a1-430a-bdd4-b10e012f3762 service nova] Acquiring lock "27abe5af-8845-40e6-a9c3-12399369b117-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:04:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-9b0ee0b8-ff45-44fa-b95b-136f49b2fad4 req-6af5f681-77a1-430a-bdd4-b10e012f3762 service nova] Lock "27abe5af-8845-40e6-a9c3-12399369b117-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:04:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-9b0ee0b8-ff45-44fa-b95b-136f49b2fad4 req-6af5f681-77a1-430a-bdd4-b10e012f3762 service nova] Lock "27abe5af-8845-40e6-a9c3-12399369b117-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:04:16 user nova-compute[71428]: DEBUG nova.compute.manager [req-9b0ee0b8-ff45-44fa-b95b-136f49b2fad4 req-6af5f681-77a1-430a-bdd4-b10e012f3762 service nova] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] No waiting events found dispatching network-vif-unplugged-85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 04:04:16 user nova-compute[71428]: DEBUG nova.compute.manager [req-9b0ee0b8-ff45-44fa-b95b-136f49b2fad4 req-6af5f681-77a1-430a-bdd4-b10e012f3762 service nova] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Received 
event network-vif-unplugged-85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d for instance with task_state deleting. {{(pid=71428) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 23 04:04:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:04:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:04:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:04:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:04:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:04:16 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Instance destroyed successfully. Apr 23 04:04:16 user nova-compute[71428]: DEBUG nova.objects.instance [None req-0338fc77-aede-436c-9a0d-2f02fc1bffbc tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lazy-loading 'resources' on Instance uuid 27abe5af-8845-40e6-a9c3-12399369b117 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 04:04:16 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-0338fc77-aede-436c-9a0d-2f02fc1bffbc tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:58:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1213396490',display_name='tempest-ServerRescueNegativeTestJSON-server-1213396490',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-1213396490',id=21,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-23T03:58:47Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='db4b7bdc478946e2ad0a2060b433b42c',ramdisk_id='',reservation_id='r-u91j4iwy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerRescueNegativeTestJSON-509507797',owner_user_name='tempest-ServerRescueNegativeTestJSON-509507797-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-23T03:58:48Z,user_data=None,user_id='84b2f522c7164c5c934b8bbf113b542a',uuid=27abe5af-8845-40e6-a9c3-12399369b117,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d", "address": "fa:16:3e:ae:5c:9d", "network": {"id": "5aff0e90-d0f3-4716-9f61-98dc8c049108", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-790345692-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "db4b7bdc478946e2ad0a2060b433b42c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap85c6dc8b-d8", "ovs_interfaceid": "85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 23 04:04:16 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-0338fc77-aede-436c-9a0d-2f02fc1bffbc tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Converting VIF 
{"id": "85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d", "address": "fa:16:3e:ae:5c:9d", "network": {"id": "5aff0e90-d0f3-4716-9f61-98dc8c049108", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-790345692-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "db4b7bdc478946e2ad0a2060b433b42c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap85c6dc8b-d8", "ovs_interfaceid": "85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 04:04:16 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-0338fc77-aede-436c-9a0d-2f02fc1bffbc tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ae:5c:9d,bridge_name='br-int',has_traffic_filtering=True,id=85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d,network=Network(5aff0e90-d0f3-4716-9f61-98dc8c049108),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap85c6dc8b-d8') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 04:04:16 user nova-compute[71428]: DEBUG os_vif [None req-0338fc77-aede-436c-9a0d-2f02fc1bffbc tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ae:5c:9d,bridge_name='br-int',has_traffic_filtering=True,id=85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d,network=Network(5aff0e90-d0f3-4716-9f61-98dc8c049108),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap85c6dc8b-d8') {{(pid=71428) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 23 04:04:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:04:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap85c6dc8b-d8, bridge=br-int, if_exists=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 04:04:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:04:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:04:16 user nova-compute[71428]: INFO os_vif [None req-0338fc77-aede-436c-9a0d-2f02fc1bffbc tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Successfully unplugged vif 
VIFOpenVSwitch(active=True,address=fa:16:3e:ae:5c:9d,bridge_name='br-int',has_traffic_filtering=True,id=85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d,network=Network(5aff0e90-d0f3-4716-9f61-98dc8c049108),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap85c6dc8b-d8') Apr 23 04:04:16 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-0338fc77-aede-436c-9a0d-2f02fc1bffbc tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Deleting instance files /opt/stack/data/nova/instances/27abe5af-8845-40e6-a9c3-12399369b117_del Apr 23 04:04:16 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-0338fc77-aede-436c-9a0d-2f02fc1bffbc tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Deletion of /opt/stack/data/nova/instances/27abe5af-8845-40e6-a9c3-12399369b117_del complete Apr 23 04:04:16 user nova-compute[71428]: INFO nova.compute.manager [None req-0338fc77-aede-436c-9a0d-2f02fc1bffbc tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Took 0.84 seconds to destroy the instance on the hypervisor. Apr 23 04:04:16 user nova-compute[71428]: DEBUG oslo.service.loopingcall [None req-0338fc77-aede-436c-9a0d-2f02fc1bffbc tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71428) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 23 04:04:16 user nova-compute[71428]: DEBUG nova.compute.manager [-] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Deallocating network for instance {{(pid=71428) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 23 04:04:16 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] deallocate_for_instance() {{(pid=71428) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 23 04:04:16 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:04:17 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:04:17 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:04:17 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:04:17 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:04:17 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Updating instance_info_cache with network_info: [] {{(pid=71428) 
update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 04:04:17 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Took 0.92 seconds to deallocate network for instance. Apr 23 04:04:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-0338fc77-aede-436c-9a0d-2f02fc1bffbc tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:04:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-0338fc77-aede-436c-9a0d-2f02fc1bffbc tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:04:17 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-0338fc77-aede-436c-9a0d-2f02fc1bffbc tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 04:04:17 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-0338fc77-aede-436c-9a0d-2f02fc1bffbc tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 04:04:17 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-0338fc77-aede-436c-9a0d-2f02fc1bffbc tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.141s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:04:17 user nova-compute[71428]: INFO nova.scheduler.client.report [None req-0338fc77-aede-436c-9a0d-2f02fc1bffbc tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Deleted allocations for instance 27abe5af-8845-40e6-a9c3-12399369b117 Apr 23 04:04:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-0338fc77-aede-436c-9a0d-2f02fc1bffbc tempest-ServerRescueNegativeTestJSON-509507797 tempest-ServerRescueNegativeTestJSON-509507797-project-member] Lock "27abe5af-8845-40e6-a9c3-12399369b117" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.225s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:04:18 
user nova-compute[71428]: DEBUG nova.compute.manager [req-104d64c3-e653-48dd-89ea-68134af1f562 req-85020bf0-87bf-4e21-8fe0-0b2751b167e6 service nova] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Received event network-vif-plugged-85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 04:04:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-104d64c3-e653-48dd-89ea-68134af1f562 req-85020bf0-87bf-4e21-8fe0-0b2751b167e6 service nova] Acquiring lock "27abe5af-8845-40e6-a9c3-12399369b117-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:04:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-104d64c3-e653-48dd-89ea-68134af1f562 req-85020bf0-87bf-4e21-8fe0-0b2751b167e6 service nova] Lock "27abe5af-8845-40e6-a9c3-12399369b117-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:04:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-104d64c3-e653-48dd-89ea-68134af1f562 req-85020bf0-87bf-4e21-8fe0-0b2751b167e6 service nova] Lock "27abe5af-8845-40e6-a9c3-12399369b117-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:04:18 user nova-compute[71428]: DEBUG nova.compute.manager [req-104d64c3-e653-48dd-89ea-68134af1f562 req-85020bf0-87bf-4e21-8fe0-0b2751b167e6 service nova] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] No waiting events found dispatching network-vif-plugged-85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 04:04:18 user nova-compute[71428]: WARNING nova.compute.manager [req-104d64c3-e653-48dd-89ea-68134af1f562 req-85020bf0-87bf-4e21-8fe0-0b2751b167e6 service nova] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Received unexpected event network-vif-plugged-85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d for instance with vm_state deleted and task_state None. 
Apr 23 04:04:18 user nova-compute[71428]: DEBUG nova.compute.manager [req-104d64c3-e653-48dd-89ea-68134af1f562 req-85020bf0-87bf-4e21-8fe0-0b2751b167e6 service nova] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Received event network-vif-plugged-85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 04:04:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-104d64c3-e653-48dd-89ea-68134af1f562 req-85020bf0-87bf-4e21-8fe0-0b2751b167e6 service nova] Acquiring lock "27abe5af-8845-40e6-a9c3-12399369b117-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:04:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-104d64c3-e653-48dd-89ea-68134af1f562 req-85020bf0-87bf-4e21-8fe0-0b2751b167e6 service nova] Lock "27abe5af-8845-40e6-a9c3-12399369b117-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:04:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-104d64c3-e653-48dd-89ea-68134af1f562 req-85020bf0-87bf-4e21-8fe0-0b2751b167e6 service nova] Lock "27abe5af-8845-40e6-a9c3-12399369b117-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:04:18 user nova-compute[71428]: DEBUG nova.compute.manager [req-104d64c3-e653-48dd-89ea-68134af1f562 req-85020bf0-87bf-4e21-8fe0-0b2751b167e6 service nova] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] No waiting events found dispatching network-vif-plugged-85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 04:04:18 user nova-compute[71428]: WARNING nova.compute.manager [req-104d64c3-e653-48dd-89ea-68134af1f562 req-85020bf0-87bf-4e21-8fe0-0b2751b167e6 service nova] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Received unexpected event network-vif-plugged-85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d for instance with vm_state deleted and task_state None. 
Apr 23 04:04:18 user nova-compute[71428]: DEBUG nova.compute.manager [req-104d64c3-e653-48dd-89ea-68134af1f562 req-85020bf0-87bf-4e21-8fe0-0b2751b167e6 service nova] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Received event network-vif-plugged-85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 04:04:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-104d64c3-e653-48dd-89ea-68134af1f562 req-85020bf0-87bf-4e21-8fe0-0b2751b167e6 service nova] Acquiring lock "27abe5af-8845-40e6-a9c3-12399369b117-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:04:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-104d64c3-e653-48dd-89ea-68134af1f562 req-85020bf0-87bf-4e21-8fe0-0b2751b167e6 service nova] Lock "27abe5af-8845-40e6-a9c3-12399369b117-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:04:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-104d64c3-e653-48dd-89ea-68134af1f562 req-85020bf0-87bf-4e21-8fe0-0b2751b167e6 service nova] Lock "27abe5af-8845-40e6-a9c3-12399369b117-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:04:18 user nova-compute[71428]: DEBUG nova.compute.manager [req-104d64c3-e653-48dd-89ea-68134af1f562 req-85020bf0-87bf-4e21-8fe0-0b2751b167e6 service nova] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] No waiting events found dispatching network-vif-plugged-85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 04:04:18 user nova-compute[71428]: WARNING nova.compute.manager [req-104d64c3-e653-48dd-89ea-68134af1f562 req-85020bf0-87bf-4e21-8fe0-0b2751b167e6 service nova] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Received unexpected event network-vif-plugged-85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d for instance with vm_state deleted and task_state None. 
Apr 23 04:04:18 user nova-compute[71428]: DEBUG nova.compute.manager [req-104d64c3-e653-48dd-89ea-68134af1f562 req-85020bf0-87bf-4e21-8fe0-0b2751b167e6 service nova] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Received event network-vif-plugged-85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 04:04:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-104d64c3-e653-48dd-89ea-68134af1f562 req-85020bf0-87bf-4e21-8fe0-0b2751b167e6 service nova] Acquiring lock "27abe5af-8845-40e6-a9c3-12399369b117-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:04:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-104d64c3-e653-48dd-89ea-68134af1f562 req-85020bf0-87bf-4e21-8fe0-0b2751b167e6 service nova] Lock "27abe5af-8845-40e6-a9c3-12399369b117-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:04:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-104d64c3-e653-48dd-89ea-68134af1f562 req-85020bf0-87bf-4e21-8fe0-0b2751b167e6 service nova] Lock "27abe5af-8845-40e6-a9c3-12399369b117-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:04:18 user nova-compute[71428]: DEBUG nova.compute.manager [req-104d64c3-e653-48dd-89ea-68134af1f562 req-85020bf0-87bf-4e21-8fe0-0b2751b167e6 service nova] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] No waiting events found dispatching network-vif-plugged-85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 04:04:18 user nova-compute[71428]: WARNING nova.compute.manager [req-104d64c3-e653-48dd-89ea-68134af1f562 req-85020bf0-87bf-4e21-8fe0-0b2751b167e6 service nova] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Received unexpected event network-vif-plugged-85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d for instance with vm_state deleted and task_state None. 
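The block of network-vif-plugged warnings above is Nova's external-event dispatcher finding no registered waiter for an instance that has already been deleted, so each event is popped, found unclaimed, and logged as unexpected. As a rough, simplified illustration of that pattern only (not Nova's actual implementation), the dispatcher amounts to a lock-protected map of per-instance events:

    # Simplified sketch of the prepare/pop event pattern seen in the log.
    # This is illustrative Python, not nova.compute.manager code.
    import threading

    class InstanceEvents:
        def __init__(self):
            self._events = {}          # {instance_uuid: {event_name: Event}}
            self._lock = threading.Lock()

        def prepare(self, instance_uuid, name):
            # Register a waiter before starting work that expects the event.
            with self._lock:
                event = threading.Event()
                self._events.setdefault(instance_uuid, {})[name] = event
                return event

        def pop(self, instance_uuid, name):
            # Called when Neutron reports the event; None means nobody waits.
            with self._lock:
                return self._events.get(instance_uuid, {}).pop(name, None)

    registry = InstanceEvents()
    event = registry.pop('27abe5af-8845-40e6-a9c3-12399369b117',
                         'network-vif-plugged-85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d')
    if event is None:
        print('No waiting events found; treat the event as unexpected')
    else:
        event.set()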
Apr 23 04:04:18 user nova-compute[71428]: DEBUG nova.compute.manager [req-104d64c3-e653-48dd-89ea-68134af1f562 req-85020bf0-87bf-4e21-8fe0-0b2751b167e6 service nova] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Received event network-vif-deleted-85c6dc8b-d8a7-4df1-a8a5-667f7d80e42d {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 04:04:21 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:04:21 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:04:21 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:04:21 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71428) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 23 04:04:23 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:04:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:04:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:04:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:04:23 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Auditing locally available compute resources for user (node: user) {{(pid=71428) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 23 04:04:23 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk --force-share --output=json {{(pid=71428) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:04:23 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:04:23 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:04:24 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:04:24 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 04:04:24 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
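The disk audit above shells out to qemu-img under oslo.concurrency's prlimit wrapper, capping address space at 1 GiB and CPU time at 30 s. A minimal sketch of issuing the same command through oslo_concurrency.processutils (the disk path is simply reused from the log for illustration; Nova's own image-info helper lives elsewhere):

    # Sketch: run "qemu-img info" under resource limits via oslo.concurrency.
    # Passing prlimit= is what produces the
    # "python3 -m oslo_concurrency.prlimit --as=... --cpu=..." wrapper above.
    from oslo_concurrency import processutils

    disk = '/opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710/disk'
    limits = processutils.ProcessLimits(address_space=1024 ** 3,  # --as=1073741824
                                        cpu_time=30)              # --cpu=30
    out, err = processutils.execute(
        'env', 'LC_ALL=C', 'LANG=C',
        'qemu-img', 'info', disk, '--force-share', '--output=json',
        prlimit=limits)
    print(out)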
Apr 23 04:04:24 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Hypervisor/Node resource view: name=user free_ram=8993MB free_disk=26.266246795654297GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71428) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 23 04:04:24 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:04:24 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:04:24 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance ee89a9d5-0507-475c-bea2-e02e34d15710 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 04:04:24 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 23 04:04:24 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 23 04:04:24 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 04:04:24 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 04:04:24 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Compute_service record updated for user:user {{(pid=71428) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 23 04:04:24 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.181s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:04:25 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:04:26 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:04:26 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:04:26 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Starting heal instance info cache {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 23 04:04:26 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Didn't find 
any instances for network info cache update. {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 23 04:04:29 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:04:30 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:04:30 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:04:31 user nova-compute[71428]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 04:04:31 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] VM Stopped (Lifecycle Event) Apr 23 04:04:31 user nova-compute[71428]: DEBUG nova.compute.manager [None req-2a83ebf8-89a3-48cf-855e-ca149da9b991 None None] [instance: 27abe5af-8845-40e6-a9c3-12399369b117] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 04:04:31 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:04:31 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:04:31 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71428) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 23 04:04:31 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 04:04:31 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 04:04:31 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:04:32 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:04:36 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:04:41 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:04:46 user nova-compute[71428]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:04:51 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:04:56 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:04:56 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:04:56 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71428) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 23 04:04:56 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 04:04:56 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 04:04:56 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:04:57 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-b6769401-d9f8-4e3a-99a4-1f8adf77fc4c tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Acquiring lock "ee89a9d5-0507-475c-bea2-e02e34d15710" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:04:57 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-b6769401-d9f8-4e3a-99a4-1f8adf77fc4c tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Lock "ee89a9d5-0507-475c-bea2-e02e34d15710" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:04:57 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-b6769401-d9f8-4e3a-99a4-1f8adf77fc4c tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Acquiring lock "ee89a9d5-0507-475c-bea2-e02e34d15710-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:04:57 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-b6769401-d9f8-4e3a-99a4-1f8adf77fc4c tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Lock "ee89a9d5-0507-475c-bea2-e02e34d15710-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:04:57 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-b6769401-d9f8-4e3a-99a4-1f8adf77fc4c tempest-ServersNegativeTestJSON-1115615559 
tempest-ServersNegativeTestJSON-1115615559-project-member] Lock "ee89a9d5-0507-475c-bea2-e02e34d15710-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:04:57 user nova-compute[71428]: INFO nova.compute.manager [None req-b6769401-d9f8-4e3a-99a4-1f8adf77fc4c tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Terminating instance Apr 23 04:04:57 user nova-compute[71428]: DEBUG nova.compute.manager [None req-b6769401-d9f8-4e3a-99a4-1f8adf77fc4c tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Start destroying the instance on the hypervisor. {{(pid=71428) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 23 04:04:57 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:04:57 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:04:57 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:04:58 user nova-compute[71428]: DEBUG nova.compute.manager [req-a93e5c6e-b07c-40bc-b330-896e0ca0f0f0 req-4fe71157-d01f-440b-8e52-265bf1e62f75 service nova] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Received event network-vif-unplugged-8948ed3a-de42-41ef-b7ca-c34a2d405b09 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 04:04:58 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-a93e5c6e-b07c-40bc-b330-896e0ca0f0f0 req-4fe71157-d01f-440b-8e52-265bf1e62f75 service nova] Acquiring lock "ee89a9d5-0507-475c-bea2-e02e34d15710-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:04:58 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-a93e5c6e-b07c-40bc-b330-896e0ca0f0f0 req-4fe71157-d01f-440b-8e52-265bf1e62f75 service nova] Lock "ee89a9d5-0507-475c-bea2-e02e34d15710-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:04:58 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-a93e5c6e-b07c-40bc-b330-896e0ca0f0f0 req-4fe71157-d01f-440b-8e52-265bf1e62f75 service nova] Lock "ee89a9d5-0507-475c-bea2-e02e34d15710-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:04:58 user nova-compute[71428]: DEBUG nova.compute.manager [req-a93e5c6e-b07c-40bc-b330-896e0ca0f0f0 req-4fe71157-d01f-440b-8e52-265bf1e62f75 service nova] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] No waiting events found dispatching network-vif-unplugged-8948ed3a-de42-41ef-b7ca-c34a2d405b09 {{(pid=71428) pop_instance_event 
/opt/stack/nova/nova/compute/manager.py:320}} Apr 23 04:04:58 user nova-compute[71428]: DEBUG nova.compute.manager [req-a93e5c6e-b07c-40bc-b330-896e0ca0f0f0 req-4fe71157-d01f-440b-8e52-265bf1e62f75 service nova] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Received event network-vif-unplugged-8948ed3a-de42-41ef-b7ca-c34a2d405b09 for instance with task_state deleting. {{(pid=71428) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 23 04:04:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:04:58 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Instance destroyed successfully. Apr 23 04:04:58 user nova-compute[71428]: DEBUG nova.objects.instance [None req-b6769401-d9f8-4e3a-99a4-1f8adf77fc4c tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Lazy-loading 'resources' on Instance uuid ee89a9d5-0507-475c-bea2-e02e34d15710 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 04:04:58 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-b6769401-d9f8-4e3a-99a4-1f8adf77fc4c tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T03:56:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1352466124',display_name='tempest-ServersNegativeTestJSON-server-1352466124',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-1352466124',id=19,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-23T03:56:26Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='39d6cb4cfcb2413f8039ee9febf1c046',ramdisk_id='',reservation_id='r-aw5j0sob',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServersNegativeTestJSON-1115615559',owner_user_name='tempest-ServersNegativeTestJSON-1115615559-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-23T03:56:27Z,user_data=None,user_id='45d9ae4a3b00492fa2fc16a22a34df09',uuid=ee89a9d5-0507-475c-bea2-e0
2e34d15710,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8948ed3a-de42-41ef-b7ca-c34a2d405b09", "address": "fa:16:3e:d2:42:a8", "network": {"id": "083ef998-5361-46b7-827f-d0f086e9e552", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1112168066-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "39d6cb4cfcb2413f8039ee9febf1c046", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8948ed3a-de", "ovs_interfaceid": "8948ed3a-de42-41ef-b7ca-c34a2d405b09", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 23 04:04:58 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-b6769401-d9f8-4e3a-99a4-1f8adf77fc4c tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Converting VIF {"id": "8948ed3a-de42-41ef-b7ca-c34a2d405b09", "address": "fa:16:3e:d2:42:a8", "network": {"id": "083ef998-5361-46b7-827f-d0f086e9e552", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1112168066-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "39d6cb4cfcb2413f8039ee9febf1c046", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8948ed3a-de", "ovs_interfaceid": "8948ed3a-de42-41ef-b7ca-c34a2d405b09", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 04:04:58 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-b6769401-d9f8-4e3a-99a4-1f8adf77fc4c tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d2:42:a8,bridge_name='br-int',has_traffic_filtering=True,id=8948ed3a-de42-41ef-b7ca-c34a2d405b09,network=Network(083ef998-5361-46b7-827f-d0f086e9e552),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8948ed3a-de') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 04:04:58 user nova-compute[71428]: DEBUG os_vif [None req-b6769401-d9f8-4e3a-99a4-1f8adf77fc4c tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Unplugging vif 
VIFOpenVSwitch(active=True,address=fa:16:3e:d2:42:a8,bridge_name='br-int',has_traffic_filtering=True,id=8948ed3a-de42-41ef-b7ca-c34a2d405b09,network=Network(083ef998-5361-46b7-827f-d0f086e9e552),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8948ed3a-de') {{(pid=71428) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 23 04:04:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:04:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8948ed3a-de, bridge=br-int, if_exists=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 04:04:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:04:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:04:58 user nova-compute[71428]: INFO os_vif [None req-b6769401-d9f8-4e3a-99a4-1f8adf77fc4c tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d2:42:a8,bridge_name='br-int',has_traffic_filtering=True,id=8948ed3a-de42-41ef-b7ca-c34a2d405b09,network=Network(083ef998-5361-46b7-827f-d0f086e9e552),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8948ed3a-de') Apr 23 04:04:58 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-b6769401-d9f8-4e3a-99a4-1f8adf77fc4c tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Deleting instance files /opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710_del Apr 23 04:04:58 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-b6769401-d9f8-4e3a-99a4-1f8adf77fc4c tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Deletion of /opt/stack/data/nova/instances/ee89a9d5-0507-475c-bea2-e02e34d15710_del complete Apr 23 04:04:58 user nova-compute[71428]: INFO nova.compute.manager [None req-b6769401-d9f8-4e3a-99a4-1f8adf77fc4c tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Took 0.66 seconds to destroy the instance on the hypervisor. Apr 23 04:04:58 user nova-compute[71428]: DEBUG oslo.service.loopingcall [None req-b6769401-d9f8-4e3a-99a4-1f8adf77fc4c tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71428) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 23 04:04:58 user nova-compute[71428]: DEBUG nova.compute.manager [-] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Deallocating network for instance {{(pid=71428) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 23 04:04:58 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] deallocate_for_instance() {{(pid=71428) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 23 04:04:59 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Updating instance_info_cache with network_info: [] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 04:04:59 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Took 0.49 seconds to deallocate network for instance. Apr 23 04:04:59 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-b6769401-d9f8-4e3a-99a4-1f8adf77fc4c tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:04:59 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-b6769401-d9f8-4e3a-99a4-1f8adf77fc4c tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:04:59 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-b6769401-d9f8-4e3a-99a4-1f8adf77fc4c tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 04:04:59 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-b6769401-d9f8-4e3a-99a4-1f8adf77fc4c tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 04:04:59 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-b6769401-d9f8-4e3a-99a4-1f8adf77fc4c tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.114s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:04:59 user nova-compute[71428]: INFO nova.scheduler.client.report [None 
req-b6769401-d9f8-4e3a-99a4-1f8adf77fc4c tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Deleted allocations for instance ee89a9d5-0507-475c-bea2-e02e34d15710 Apr 23 04:04:59 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-b6769401-d9f8-4e3a-99a4-1f8adf77fc4c tempest-ServersNegativeTestJSON-1115615559 tempest-ServersNegativeTestJSON-1115615559-project-member] Lock "ee89a9d5-0507-475c-bea2-e02e34d15710" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.468s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:05:00 user nova-compute[71428]: DEBUG nova.compute.manager [req-0d09e7ee-c2ea-46c1-ab96-5f0d5a77e27d req-077439ed-062d-44c5-a6e8-b398d29a6393 service nova] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Received event network-vif-plugged-8948ed3a-de42-41ef-b7ca-c34a2d405b09 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 04:05:00 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-0d09e7ee-c2ea-46c1-ab96-5f0d5a77e27d req-077439ed-062d-44c5-a6e8-b398d29a6393 service nova] Acquiring lock "ee89a9d5-0507-475c-bea2-e02e34d15710-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:05:00 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-0d09e7ee-c2ea-46c1-ab96-5f0d5a77e27d req-077439ed-062d-44c5-a6e8-b398d29a6393 service nova] Lock "ee89a9d5-0507-475c-bea2-e02e34d15710-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:05:00 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-0d09e7ee-c2ea-46c1-ab96-5f0d5a77e27d req-077439ed-062d-44c5-a6e8-b398d29a6393 service nova] Lock "ee89a9d5-0507-475c-bea2-e02e34d15710-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:05:00 user nova-compute[71428]: DEBUG nova.compute.manager [req-0d09e7ee-c2ea-46c1-ab96-5f0d5a77e27d req-077439ed-062d-44c5-a6e8-b398d29a6393 service nova] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] No waiting events found dispatching network-vif-plugged-8948ed3a-de42-41ef-b7ca-c34a2d405b09 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 04:05:00 user nova-compute[71428]: WARNING nova.compute.manager [req-0d09e7ee-c2ea-46c1-ab96-5f0d5a77e27d req-077439ed-062d-44c5-a6e8-b398d29a6393 service nova] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Received unexpected event network-vif-plugged-8948ed3a-de42-41ef-b7ca-c34a2d405b09 for instance with vm_state deleted and task_state None. 
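The Acquiring/acquired/released lines above come from oslo.concurrency's named locks, which nova uses to serialize work such as do_terminate_instance, resource-tracker updates, and instance-event dispatch on a per-name basis. Below is a minimal standalone sketch of that pattern, assuming only the public lockutils API; the lock names, function names, and UUID are illustrative and are not Nova's actual code.

```python
from oslo_concurrency import lockutils

# Decorator form: callers of this function are serialized on the named lock,
# and lockutils emits DEBUG messages like the
# 'Lock "compute_resources" acquired by ... :: waited' / '"released" ... held'
# lines seen in the log above (the function body here is illustrative).
@lockutils.synchronized('compute_resources')
def update_usage(instance_uuid):
    print('updating usage for %s' % instance_uuid)

# Context-manager form, used for ad-hoc critical sections such as the
# per-instance "refresh_cache-<uuid>" and "<uuid>-events" locks above.
def refresh_cache(instance_uuid):
    with lockutils.lock('refresh_cache-%s' % instance_uuid):
        pass  # rebuild the instance network info cache here

update_usage('ee89a9d5-0507-475c-bea2-e02e34d15710')
refresh_cache('ee89a9d5-0507-475c-bea2-e02e34d15710')
```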
Apr 23 04:05:00 user nova-compute[71428]: DEBUG nova.compute.manager [req-0d09e7ee-c2ea-46c1-ab96-5f0d5a77e27d req-077439ed-062d-44c5-a6e8-b398d29a6393 service nova] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Received event network-vif-deleted-8948ed3a-de42-41ef-b7ca-c34a2d405b09 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 04:05:03 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:05:08 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:05:08 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:05:13 user nova-compute[71428]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 04:05:13 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] VM Stopped (Lifecycle Event) Apr 23 04:05:13 user nova-compute[71428]: DEBUG nova.compute.manager [None req-a8ae97c2-d5d1-4c1b-8509-9e6883d1daba None None] [instance: ee89a9d5-0507-475c-bea2-e02e34d15710] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 04:05:13 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:05:14 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:05:15 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:05:18 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:05:18 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:05:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Acquiring lock "4205fb36-0df5-4c11-b468-89f59d97352c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:05:18 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Lock "4205fb36-0df5-4c11-b468-89f59d97352c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:05:19 
user nova-compute[71428]: DEBUG nova.compute.manager [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Starting instance... {{(pid=71428) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 23 04:05:19 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:05:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:05:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:05:19 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71428) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 23 04:05:19 user nova-compute[71428]: INFO nova.compute.claims [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Claim successful on node user Apr 23 04:05:19 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 04:05:19 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 04:05:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.203s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:05:19 user nova-compute[71428]: DEBUG nova.compute.manager [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Start building networks asynchronously for instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 23 04:05:19 user nova-compute[71428]: DEBUG nova.compute.manager [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Allocating IP information in the background. {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 23 04:05:19 user nova-compute[71428]: DEBUG nova.network.neutron [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] allocate_for_instance() {{(pid=71428) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 23 04:05:19 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 23 04:05:19 user nova-compute[71428]: DEBUG nova.compute.manager [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Start building block device mappings for instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 23 04:05:19 user nova-compute[71428]: DEBUG nova.compute.manager [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Start spawning the instance on the hypervisor. 
{{(pid=71428) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 23 04:05:19 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Creating instance directory {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 23 04:05:19 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Creating image(s) Apr 23 04:05:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Acquiring lock "/opt/stack/data/nova/instances/4205fb36-0df5-4c11-b468-89f59d97352c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:05:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Lock "/opt/stack/data/nova/instances/4205fb36-0df5-4c11-b468-89f59d97352c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:05:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Lock "/opt/stack/data/nova/instances/4205fb36-0df5-4c11-b468-89f59d97352c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:05:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:05:19 user nova-compute[71428]: DEBUG nova.policy [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '929f88dec4234641a37fdda799108cf2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7d84d2e6776c4ac38b90b752a36600c3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 
'service_project_domain_id': None, 'service_roles': []} {{(pid=71428) authorize /opt/stack/nova/nova/policy.py:203}} Apr 23 04:05:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.135s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:05:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Acquiring lock "935dc3474dddbde110531a31510941caddc0ae83" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:05:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:05:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:05:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.133s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:05:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/4205fb36-0df5-4c11-b468-89f59d97352c/disk 1073741824 {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:05:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 
tempest-AttachVolumeShelveTestJSON-1622749322-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/4205fb36-0df5-4c11-b468-89f59d97352c/disk 1073741824" returned: 0 in 0.047s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:05:19 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.190s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:05:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:05:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.153s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:05:19 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Checking if we can resize image /opt/stack/data/nova/instances/4205fb36-0df5-4c11-b468-89f59d97352c/disk. 
size=1073741824 {{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 23 04:05:19 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4205fb36-0df5-4c11-b468-89f59d97352c/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:05:20 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4205fb36-0df5-4c11-b468-89f59d97352c/disk --force-share --output=json" returned: 0 in 0.128s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:05:20 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Cannot resize image /opt/stack/data/nova/instances/4205fb36-0df5-4c11-b468-89f59d97352c/disk to a smaller size. {{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 23 04:05:20 user nova-compute[71428]: DEBUG nova.objects.instance [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Lazy-loading 'migration_context' on Instance uuid 4205fb36-0df5-4c11-b468-89f59d97352c {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 04:05:20 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Created local disks {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 23 04:05:20 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Ensure instance console log exists: /opt/stack/data/nova/instances/4205fb36-0df5-4c11-b468-89f59d97352c/console.log {{(pid=71428) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 23 04:05:20 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:05:20 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Lock 
"vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:05:20 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:05:20 user nova-compute[71428]: DEBUG nova.network.neutron [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Successfully created port: bb5db91e-75d7-4a5c-90c4-66035e0f7628 {{(pid=71428) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 23 04:05:20 user nova-compute[71428]: DEBUG nova.network.neutron [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Successfully updated port: bb5db91e-75d7-4a5c-90c4-66035e0f7628 {{(pid=71428) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 23 04:05:20 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Acquiring lock "refresh_cache-4205fb36-0df5-4c11-b468-89f59d97352c" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 04:05:20 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Acquired lock "refresh_cache-4205fb36-0df5-4c11-b468-89f59d97352c" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 04:05:20 user nova-compute[71428]: DEBUG nova.network.neutron [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Building network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 23 04:05:20 user nova-compute[71428]: DEBUG nova.compute.manager [req-385cc9c5-cbbe-4aa0-b171-86daa36f002b req-c97ab7c9-7d66-42e7-84fc-004809cf1622 service nova] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Received event network-changed-bb5db91e-75d7-4a5c-90c4-66035e0f7628 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 04:05:20 user nova-compute[71428]: DEBUG nova.compute.manager [req-385cc9c5-cbbe-4aa0-b171-86daa36f002b req-c97ab7c9-7d66-42e7-84fc-004809cf1622 service nova] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Refreshing instance network info cache due to event network-changed-bb5db91e-75d7-4a5c-90c4-66035e0f7628. 
{{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 23 04:05:20 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-385cc9c5-cbbe-4aa0-b171-86daa36f002b req-c97ab7c9-7d66-42e7-84fc-004809cf1622 service nova] Acquiring lock "refresh_cache-4205fb36-0df5-4c11-b468-89f59d97352c" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 04:05:20 user nova-compute[71428]: DEBUG nova.network.neutron [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Instance cache missing network info. {{(pid=71428) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 23 04:05:21 user nova-compute[71428]: DEBUG nova.network.neutron [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Updating instance_info_cache with network_info: [{"id": "bb5db91e-75d7-4a5c-90c4-66035e0f7628", "address": "fa:16:3e:96:1f:e7", "network": {"id": "cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1309461192-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7d84d2e6776c4ac38b90b752a36600c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb5db91e-75", "ovs_interfaceid": "bb5db91e-75d7-4a5c-90c4-66035e0f7628", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 04:05:21 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Releasing lock "refresh_cache-4205fb36-0df5-4c11-b468-89f59d97352c" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 04:05:21 user nova-compute[71428]: DEBUG nova.compute.manager [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Instance network_info: |[{"id": "bb5db91e-75d7-4a5c-90c4-66035e0f7628", "address": "fa:16:3e:96:1f:e7", "network": {"id": "cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1309461192-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7d84d2e6776c4ac38b90b752a36600c3", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb5db91e-75", "ovs_interfaceid": "bb5db91e-75d7-4a5c-90c4-66035e0f7628", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 23 04:05:21 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-385cc9c5-cbbe-4aa0-b171-86daa36f002b req-c97ab7c9-7d66-42e7-84fc-004809cf1622 service nova] Acquired lock "refresh_cache-4205fb36-0df5-4c11-b468-89f59d97352c" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 04:05:21 user nova-compute[71428]: DEBUG nova.network.neutron [req-385cc9c5-cbbe-4aa0-b171-86daa36f002b req-c97ab7c9-7d66-42e7-84fc-004809cf1622 service nova] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Refreshing network info cache for port bb5db91e-75d7-4a5c-90c4-66035e0f7628 {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 23 04:05:21 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Start _get_guest_xml network_info=[{"id": "bb5db91e-75d7-4a5c-90c4-66035e0f7628", "address": "fa:16:3e:96:1f:e7", "network": {"id": "cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1309461192-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7d84d2e6776c4ac38b90b752a36600c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb5db91e-75", "ovs_interfaceid": "bb5db91e-75d7-4a5c-90c4-66035e0f7628", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'image_id': 'e6127373-9931-4277-9458-eceef653ea1e'}], 'ephemerals': [], 'block_device_mapping': [], 
'swap': None} {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 23 04:05:21 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 04:05:21 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 04:05:21 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71428) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 23 04:05:21 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-23T03:43:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=), allow threads: True {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 23 04:05:21 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Flavor limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 23 04:05:21 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Image limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 23 04:05:21 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Flavor pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 23 04:05:21 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Image pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:387}} Apr 23 04:05:21 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 23 04:05:21 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 23 04:05:21 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 23 04:05:21 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Got 1 possible topologies {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 23 04:05:21 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 23 04:05:21 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 23 04:05:21 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T04:05:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1244076250',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-1244076250',id=23,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBByIgBCVO1TwHgSgzcCgt56xToj40uPOeHYofZEu1bJvNmhrfUCOYMcMRL/mO4ojjhrAB0fmobOyBQIrSmCW/jGTnyXoK8G1fv5rap39s97Qi9SpKbhu+T8NGRCCInZWaw==',key_name='tempest-keypair-679082328',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7d84d2e6776c4ac38b90b752a36600c3',ramdisk_id='',reservation_id='r-tmf2omnf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-1622749322',owner_user_name='tempest-AttachVolumeShelveTestJSON-1622749322-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T04:05:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='929f88dec4234641a37fdda799108cf2',uuid=4205fb36-0df5-4c11-b468-89f59d97352c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bb5db91e-75d7-4a5c-90c4-66035e0f7628", "address": "fa:16:3e:96:1f:e7", "network": {"id": "cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1309461192-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7d84d2e6776c4ac38b90b752a36600c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb5db91e-75", "ovs_interfaceid": "bb5db91e-75d7-4a5c-90c4-66035e0f7628", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71428) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 23 04:05:21 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Converting VIF {"id": "bb5db91e-75d7-4a5c-90c4-66035e0f7628", "address": "fa:16:3e:96:1f:e7", "network": {"id": "cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1309461192-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": 
"10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7d84d2e6776c4ac38b90b752a36600c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb5db91e-75", "ovs_interfaceid": "bb5db91e-75d7-4a5c-90c4-66035e0f7628", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 04:05:21 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:96:1f:e7,bridge_name='br-int',has_traffic_filtering=True,id=bb5db91e-75d7-4a5c-90c4-66035e0f7628,network=Network(cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb5db91e-75') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 04:05:21 user nova-compute[71428]: DEBUG nova.objects.instance [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Lazy-loading 'pci_devices' on Instance uuid 4205fb36-0df5-4c11-b468-89f59d97352c {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 04:05:21 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] End _get_guest_xml xml= Apr 23 04:05:21 user nova-compute[71428]: 4205fb36-0df5-4c11-b468-89f59d97352c Apr 23 04:05:21 user nova-compute[71428]: instance-00000017 Apr 23 04:05:21 user nova-compute[71428]: 131072 Apr 23 04:05:21 user nova-compute[71428]: 1 Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: tempest-AttachVolumeShelveTestJSON-server-1244076250 Apr 23 04:05:21 user nova-compute[71428]: 2023-04-23 04:05:21 Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: 128 Apr 23 04:05:21 user nova-compute[71428]: 1 Apr 23 04:05:21 user nova-compute[71428]: 0 Apr 23 04:05:21 user nova-compute[71428]: 0 Apr 23 04:05:21 user nova-compute[71428]: 1 Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: tempest-AttachVolumeShelveTestJSON-1622749322-project-member Apr 23 04:05:21 user nova-compute[71428]: tempest-AttachVolumeShelveTestJSON-1622749322 Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: OpenStack Foundation Apr 23 04:05:21 user nova-compute[71428]: OpenStack Nova Apr 23 04:05:21 user 
nova-compute[71428]: 0.0.0 Apr 23 04:05:21 user nova-compute[71428]: 4205fb36-0df5-4c11-b468-89f59d97352c Apr 23 04:05:21 user nova-compute[71428]: 4205fb36-0df5-4c11-b468-89f59d97352c Apr 23 04:05:21 user nova-compute[71428]: Virtual Machine Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: hvm Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Nehalem Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: /dev/urandom Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: Apr 23 04:05:21 user nova-compute[71428]: {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 23 04:05:21 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T04:05:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1244076250',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-1244076250',id=23,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBByIgBCVO1TwHgSgzcCgt56xToj40uPOeHYofZEu1bJvNmhrfUCOYMcMRL/mO4ojjhrAB0fmobOyBQIrSmCW/jGTnyXoK8G1fv5rap39s97Qi9SpKbhu+T8NGRCCInZWaw==',key_name='tempest-keypair-679082328',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7d84d2e6776c4ac38b90b752a36600c3',ramdisk_id='',reservation_id='r-tmf2omnf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-1622749322',owner_user_name='tempest-AttachVolumeShelveTestJSON-1622749322-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T04:05:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='929f88dec4234641a37fdda799108cf2',uuid=4205fb36-0df5-4c11-b468-89f59d97352c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bb5db91e-75d7-4a5c-90c4-66035e0f7628", "address": "fa:16:3e:96:1f:e7", "network": {"id": "cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1309461192-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7d84d2e6776c4ac38b90b752a36600c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb5db91e-75", "ovs_interfaceid": "bb5db91e-75d7-4a5c-90c4-66035e0f7628", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 23 04:05:21 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Converting VIF {"id": "bb5db91e-75d7-4a5c-90c4-66035e0f7628", "address": "fa:16:3e:96:1f:e7", "network": {"id": "cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1309461192-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], 
"meta": {"injected": false, "tenant_id": "7d84d2e6776c4ac38b90b752a36600c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb5db91e-75", "ovs_interfaceid": "bb5db91e-75d7-4a5c-90c4-66035e0f7628", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 04:05:21 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:96:1f:e7,bridge_name='br-int',has_traffic_filtering=True,id=bb5db91e-75d7-4a5c-90c4-66035e0f7628,network=Network(cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb5db91e-75') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 04:05:21 user nova-compute[71428]: DEBUG os_vif [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:96:1f:e7,bridge_name='br-int',has_traffic_filtering=True,id=bb5db91e-75d7-4a5c-90c4-66035e0f7628,network=Network(cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb5db91e-75') {{(pid=71428) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 23 04:05:21 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:05:21 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 04:05:21 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 23 04:05:21 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:05:21 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbb5db91e-75, may_exist=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 04:05:21 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbb5db91e-75, col_values=(('external_ids', {'iface-id': 'bb5db91e-75d7-4a5c-90c4-66035e0f7628', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:96:1f:e7', 'vm-uuid': '4205fb36-0df5-4c11-b468-89f59d97352c'}),)) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 04:05:21 user 
nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:05:21 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:05:21 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:05:21 user nova-compute[71428]: INFO os_vif [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:96:1f:e7,bridge_name='br-int',has_traffic_filtering=True,id=bb5db91e-75d7-4a5c-90c4-66035e0f7628,network=Network(cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb5db91e-75') Apr 23 04:05:21 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] No BDM found with device name vda, not building metadata. {{(pid=71428) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 23 04:05:21 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] No VIF found with MAC fa:16:3e:96:1f:e7, not building metadata {{(pid=71428) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 23 04:05:21 user nova-compute[71428]: DEBUG nova.network.neutron [req-385cc9c5-cbbe-4aa0-b171-86daa36f002b req-c97ab7c9-7d66-42e7-84fc-004809cf1622 service nova] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Updated VIF entry in instance network info cache for port bb5db91e-75d7-4a5c-90c4-66035e0f7628. 
{{(pid=71428) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 23 04:05:21 user nova-compute[71428]: DEBUG nova.network.neutron [req-385cc9c5-cbbe-4aa0-b171-86daa36f002b req-c97ab7c9-7d66-42e7-84fc-004809cf1622 service nova] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Updating instance_info_cache with network_info: [{"id": "bb5db91e-75d7-4a5c-90c4-66035e0f7628", "address": "fa:16:3e:96:1f:e7", "network": {"id": "cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1309461192-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7d84d2e6776c4ac38b90b752a36600c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb5db91e-75", "ovs_interfaceid": "bb5db91e-75d7-4a5c-90c4-66035e0f7628", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 04:05:21 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-385cc9c5-cbbe-4aa0-b171-86daa36f002b req-c97ab7c9-7d66-42e7-84fc-004809cf1622 service nova] Releasing lock "refresh_cache-4205fb36-0df5-4c11-b468-89f59d97352c" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 04:05:22 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:05:22 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71428) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 23 04:05:22 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:05:22 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:05:22 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:05:22 user nova-compute[71428]: DEBUG nova.compute.manager [req-1816177f-9d84-407b-abd1-57c337b93f17 req-e84002a7-2ab0-4b70-b5e0-7a8fd482a6b1 service nova] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Received event network-vif-plugged-bb5db91e-75d7-4a5c-90c4-66035e0f7628 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 04:05:22 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-1816177f-9d84-407b-abd1-57c337b93f17 req-e84002a7-2ab0-4b70-b5e0-7a8fd482a6b1 service nova] Acquiring lock "4205fb36-0df5-4c11-b468-89f59d97352c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:05:22 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-1816177f-9d84-407b-abd1-57c337b93f17 req-e84002a7-2ab0-4b70-b5e0-7a8fd482a6b1 service nova] Lock "4205fb36-0df5-4c11-b468-89f59d97352c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:05:22 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-1816177f-9d84-407b-abd1-57c337b93f17 req-e84002a7-2ab0-4b70-b5e0-7a8fd482a6b1 service nova] Lock "4205fb36-0df5-4c11-b468-89f59d97352c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:05:22 user nova-compute[71428]: DEBUG nova.compute.manager [req-1816177f-9d84-407b-abd1-57c337b93f17 req-e84002a7-2ab0-4b70-b5e0-7a8fd482a6b1 service nova] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] No waiting events found dispatching network-vif-plugged-bb5db91e-75d7-4a5c-90c4-66035e0f7628 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 04:05:22 user nova-compute[71428]: WARNING nova.compute.manager [req-1816177f-9d84-407b-abd1-57c337b93f17 req-e84002a7-2ab0-4b70-b5e0-7a8fd482a6b1 service nova] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Received unexpected event network-vif-plugged-bb5db91e-75d7-4a5c-90c4-66035e0f7628 for instance with vm_state building and task_state spawning. 
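The entries above show the full plug path for port bb5db91e-75d7-4a5c-90c4-66035e0f7628: Nova's network_info dict is converted to an os-vif VIFOpenVSwitch object, the 'ovs' plugin issues the AddBridgeCommand/AddPortCommand/DbSetCommand transactions that attach tapbb5db91e-75 to br-int with iface-id, attached-mac and vm-uuid in external_ids, after which Neutron reports the port as up via the network-vif-plugged events seen shortly afterwards. A minimal sketch of driving os-vif directly, assuming a running OVS database and root privileges; the field values are copied from the log, and the constructor arguments shown are illustrative rather than a complete required set:

    import os_vif
    from os_vif.objects import instance_info, network, vif

    os_vif.initialize()  # loads the linux_bridge, noop and ovs plugins

    port = vif.VIFOpenVSwitch(
        id='bb5db91e-75d7-4a5c-90c4-66035e0f7628',
        address='fa:16:3e:96:1f:e7',
        vif_name='tapbb5db91e-75',
        bridge_name='br-int',
        plugin='ovs',
        port_profile=vif.VIFPortProfileOpenVSwitch(
            interface_id='bb5db91e-75d7-4a5c-90c4-66035e0f7628'),
        network=network.Network(id='cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b'),
    )
    instance = instance_info.InstanceInfo(
        uuid='4205fb36-0df5-4c11-b468-89f59d97352c',
        name='vif-plug-sketch',  # assumed display name, not taken from the log
    )

    # Translates into the ovsdbapp transactions logged above.
    os_vif.plug(port, instance)
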
Apr 23 04:05:23 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:05:23 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:05:23 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:05:23 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:05:24 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:05:24 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:05:24 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Resumed> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 04:05:24 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] VM Resumed (Lifecycle Event) Apr 23 04:05:24 user nova-compute[71428]: DEBUG nova.compute.manager [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Instance event wait completed in 0 seconds for {{(pid=71428) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 23 04:05:24 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Guest created on hypervisor {{(pid=71428) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 23 04:05:24 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Instance spawned successfully. 
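The Resumed lifecycle event and the sync_power_state entries above illustrate why the handler skips: the database still holds power_state 0 (NOSTATE) while libvirt already reports 1 (RUNNING), but the instance has task_state 'spawning', so the periodic sync leaves it alone and lets the build path record the final state. A small self-contained sketch of that decision, paraphrasing the logged behaviour rather than quoting Nova's actual code (the constants match nova.compute.power_state):

    # Power-state codes as they appear in the log lines above.
    NOSTATE = 0   # what the database still records while building
    RUNNING = 1   # what the hypervisor reports once the guest is created

    def sync_power_state(db_power_state, vm_power_state, task_state):
        """Paraphrase of the skip logic shown in the log, not Nova's code."""
        if task_state is not None:   # e.g. 'spawning': an operation owns the instance
            return 'skip'
        if db_power_state != vm_power_state:
            return 'update database to %d' % vm_power_state
        return 'in sync'

    print(sync_power_state(NOSTATE, RUNNING, 'spawning'))  # -> skip
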
Apr 23 04:05:24 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 23 04:05:24 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 04:05:24 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 04:05:24 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Found default for hw_cdrom_bus of ide {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 04:05:24 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Found default for hw_disk_bus of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 04:05:24 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Found default for hw_input_bus of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 04:05:24 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Found default for hw_pointer_model of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 04:05:24 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Found default for hw_video_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 04:05:24 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 
tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Found default for hw_vif_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 04:05:24 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] During sync_power_state the instance has a pending task (spawning). Skip. Apr 23 04:05:24 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Started> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 04:05:24 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] VM Started (Lifecycle Event) Apr 23 04:05:24 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 04:05:24 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 04:05:24 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] During sync_power_state the instance has a pending task (spawning). Skip. Apr 23 04:05:24 user nova-compute[71428]: INFO nova.compute.manager [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Took 5.43 seconds to spawn the instance on the hypervisor. 
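The "Found default for hw_*" entries above show the libvirt driver recording the bus and model defaults it picked (ide CD-ROM, virtio disk, virtio video, virtio VIF) so the guest keeps the same hardware layout across later rebuilds and migrations. A hedged sketch of pinning the same values explicitly on the source image so they no longer depend on driver defaults; the cloud name and the use of openstacksdk's update_image() with property keyword arguments are assumptions, and only the image UUID and property values come from the log:

    import openstack

    conn = openstack.connect(cloud='devstack')   # assumed clouds.yaml entry
    conn.image.update_image(
        'e6127373-9931-4277-9458-eceef653ea1e',  # image_base_image_ref above
        hw_cdrom_bus='ide',
        hw_disk_bus='virtio',
        hw_video_model='virtio',
        hw_vif_model='virtio',
    )
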
Apr 23 04:05:24 user nova-compute[71428]: DEBUG nova.compute.manager [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 04:05:24 user nova-compute[71428]: DEBUG nova.compute.manager [req-8d103cd0-3f5d-43e0-8e7d-4acfedbd5817 req-7d131158-9368-42ac-8806-1193a5d86fa1 service nova] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Received event network-vif-plugged-bb5db91e-75d7-4a5c-90c4-66035e0f7628 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 04:05:24 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-8d103cd0-3f5d-43e0-8e7d-4acfedbd5817 req-7d131158-9368-42ac-8806-1193a5d86fa1 service nova] Acquiring lock "4205fb36-0df5-4c11-b468-89f59d97352c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:05:24 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-8d103cd0-3f5d-43e0-8e7d-4acfedbd5817 req-7d131158-9368-42ac-8806-1193a5d86fa1 service nova] Lock "4205fb36-0df5-4c11-b468-89f59d97352c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:05:24 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-8d103cd0-3f5d-43e0-8e7d-4acfedbd5817 req-7d131158-9368-42ac-8806-1193a5d86fa1 service nova] Lock "4205fb36-0df5-4c11-b468-89f59d97352c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:05:24 user nova-compute[71428]: DEBUG nova.compute.manager [req-8d103cd0-3f5d-43e0-8e7d-4acfedbd5817 req-7d131158-9368-42ac-8806-1193a5d86fa1 service nova] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] No waiting events found dispatching network-vif-plugged-bb5db91e-75d7-4a5c-90c4-66035e0f7628 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 04:05:24 user nova-compute[71428]: WARNING nova.compute.manager [req-8d103cd0-3f5d-43e0-8e7d-4acfedbd5817 req-7d131158-9368-42ac-8806-1193a5d86fa1 service nova] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Received unexpected event network-vif-plugged-bb5db91e-75d7-4a5c-90c4-66035e0f7628 for instance with vm_state building and task_state spawning. Apr 23 04:05:25 user nova-compute[71428]: INFO nova.compute.manager [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Took 5.96 seconds to build instance. 
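The repeated "Received event network-vif-plugged-... / Received unexpected event" pair above comes from Neutron notifying Nova through the server external events API once the port is reported up; because the spawn path was not waiting on the event at that moment, the compute manager logs it as unexpected and carries on. A minimal illustration of the notification payload involved, built from values in the log; the payload shape follows the public server external events API, while delivery details (endpoint URL, authentication) are omitted:

    import json

    # Event body Neutron POSTs to Nova's os-server-external-events API.
    payload = {
        "events": [
            {
                "name": "network-vif-plugged",
                "server_uuid": "4205fb36-0df5-4c11-b468-89f59d97352c",
                "tag": "bb5db91e-75d7-4a5c-90c4-66035e0f7628",  # the Neutron port ID
                "status": "completed",
            }
        ]
    }
    print(json.dumps(payload, indent=2))
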
Apr 23 04:05:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-3cd718e2-feaf-490d-b950-ce564388c25a tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Lock "4205fb36-0df5-4c11-b468-89f59d97352c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.062s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:05:25 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:05:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:05:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:05:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:05:25 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Auditing locally available compute resources for user (node: user) {{(pid=71428) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 23 04:05:25 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4205fb36-0df5-4c11-b468-89f59d97352c/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:05:25 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4205fb36-0df5-4c11-b468-89f59d97352c/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:05:25 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4205fb36-0df5-4c11-b468-89f59d97352c/disk --force-share --output=json {{(pid=71428) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:05:26 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4205fb36-0df5-4c11-b468-89f59d97352c/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:05:26 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:05:26 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 04:05:26 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 04:05:26 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Hypervisor/Node resource view: name=user free_ram=9057MB free_disk=26.283214569091797GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": 
"type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", 
"vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71428) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 23 04:05:26 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:05:26 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:05:26 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 4205fb36-0df5-4c11-b468-89f59d97352c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 04:05:26 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 23 04:05:26 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 23 04:05:26 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 04:05:26 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 04:05:26 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Compute_service record updated for user:user {{(pid=71428) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 23 04:05:26 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.192s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:05:28 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:05:28 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Starting heal instance info cache {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 23 04:05:28 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Rebuilding the list of instances to heal {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 23 04:05:28 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "refresh_cache-4205fb36-0df5-4c11-b468-89f59d97352c" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 04:05:28 user nova-compute[71428]: DEBUG 
oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquired lock "refresh_cache-4205fb36-0df5-4c11-b468-89f59d97352c" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 04:05:28 user nova-compute[71428]: DEBUG nova.network.neutron [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Forcefully refreshing network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 23 04:05:28 user nova-compute[71428]: DEBUG nova.objects.instance [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lazy-loading 'info_cache' on Instance uuid 4205fb36-0df5-4c11-b468-89f59d97352c {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 04:05:29 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:05:29 user nova-compute[71428]: DEBUG nova.network.neutron [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Updating instance_info_cache with network_info: [{"id": "bb5db91e-75d7-4a5c-90c4-66035e0f7628", "address": "fa:16:3e:96:1f:e7", "network": {"id": "cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1309461192-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7d84d2e6776c4ac38b90b752a36600c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb5db91e-75", "ovs_interfaceid": "bb5db91e-75d7-4a5c-90c4-66035e0f7628", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 04:05:29 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Releasing lock "refresh_cache-4205fb36-0df5-4c11-b468-89f59d97352c" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 04:05:29 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Updated the network info_cache for instance {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 23 04:05:30 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:05:31 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:05:31 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None 
req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:05:32 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:05:36 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:05:39 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:05:41 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:05:44 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:05:46 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:05:49 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:05:50 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:05:51 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:05:54 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:05:56 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:06:01 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:06:02 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:06:06 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:06:09 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:06:10 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:06:11 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:06:12 user nova-compute[71428]: 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:06:14 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:06:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:06:16 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:06:18 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:06:19 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:06:19 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:06:21 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:06:22 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Acquiring lock "8171e321-666d-44fc-a0ef-b297ec22369b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:06:22 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Lock "8171e321-666d-44fc-a0ef-b297ec22369b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:06:22 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:06:22 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71428) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 23 04:06:22 user nova-compute[71428]: DEBUG nova.compute.manager [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Starting instance... 
{{(pid=71428) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 23 04:06:22 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:06:22 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.004s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:06:22 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71428) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 23 04:06:22 user nova-compute[71428]: INFO nova.compute.claims [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Claim successful on node user Apr 23 04:06:22 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 04:06:22 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 04:06:22 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.216s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:06:22 user nova-compute[71428]: DEBUG nova.compute.manager [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Start building networks asynchronously for 
instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 23 04:06:23 user nova-compute[71428]: DEBUG nova.compute.manager [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Allocating IP information in the background. {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 23 04:06:23 user nova-compute[71428]: DEBUG nova.network.neutron [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] allocate_for_instance() {{(pid=71428) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 23 04:06:23 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 23 04:06:23 user nova-compute[71428]: DEBUG nova.compute.manager [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Start building block device mappings for instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 23 04:06:23 user nova-compute[71428]: DEBUG nova.policy [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c452706dd7dc4f54b45e9f959288bd6b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f7eb2f69eab54dd8aac3e6cca9d5e46a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71428) authorize /opt/stack/nova/nova/policy.py:203}} Apr 23 04:06:23 user nova-compute[71428]: DEBUG nova.compute.manager [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Start spawning the instance on the hypervisor. 
{{(pid=71428) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 23 04:06:23 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Creating instance directory {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 23 04:06:23 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Creating image(s) Apr 23 04:06:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Acquiring lock "/opt/stack/data/nova/instances/8171e321-666d-44fc-a0ef-b297ec22369b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:06:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Lock "/opt/stack/data/nova/instances/8171e321-666d-44fc-a0ef-b297ec22369b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:06:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Lock "/opt/stack/data/nova/instances/8171e321-666d-44fc-a0ef-b297ec22369b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:06:23 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:06:23 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.134s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:06:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None 
req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Acquiring lock "935dc3474dddbde110531a31510941caddc0ae83" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:06:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:06:23 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:06:23 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.132s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:06:23 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/8171e321-666d-44fc-a0ef-b297ec22369b/disk 1073741824 {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:06:23 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/8171e321-666d-44fc-a0ef-b297ec22369b/disk 1073741824" returned: 0 in 0.049s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:06:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 
0.188s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:06:23 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:06:23 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.132s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:06:23 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Checking if we can resize image /opt/stack/data/nova/instances/8171e321-666d-44fc-a0ef-b297ec22369b/disk. size=1073741824 {{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 23 04:06:23 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8171e321-666d-44fc-a0ef-b297ec22369b/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:06:23 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8171e321-666d-44fc-a0ef-b297ec22369b/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:06:23 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Cannot resize image /opt/stack/data/nova/instances/8171e321-666d-44fc-a0ef-b297ec22369b/disk to a smaller size. 
{{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 23 04:06:23 user nova-compute[71428]: DEBUG nova.objects.instance [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Lazy-loading 'migration_context' on Instance uuid 8171e321-666d-44fc-a0ef-b297ec22369b {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 04:06:23 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Created local disks {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 23 04:06:23 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Ensure instance console log exists: /opt/stack/data/nova/instances/8171e321-666d-44fc-a0ef-b297ec22369b/console.log {{(pid=71428) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 23 04:06:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:06:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:06:23 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:06:23 user nova-compute[71428]: DEBUG nova.network.neutron [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Successfully created port: a1a195c9-37d9-45f8-b8dd-989b54c8f2dd {{(pid=71428) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG nova.network.neutron [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Successfully updated port: a1a195c9-37d9-45f8-b8dd-989b54c8f2dd {{(pid=71428) _update_port 
/opt/stack/nova/nova/network/neutron.py:584}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Acquiring lock "refresh_cache-8171e321-666d-44fc-a0ef-b297ec22369b" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Acquired lock "refresh_cache-8171e321-666d-44fc-a0ef-b297ec22369b" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG nova.network.neutron [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Building network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG nova.compute.manager [req-63b57f31-5c4b-40b0-ae2d-4af030b5a182 req-c035796d-0f2c-444a-b71f-2d7edc9fe362 service nova] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Received event network-changed-a1a195c9-37d9-45f8-b8dd-989b54c8f2dd {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG nova.compute.manager [req-63b57f31-5c4b-40b0-ae2d-4af030b5a182 req-c035796d-0f2c-444a-b71f-2d7edc9fe362 service nova] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Refreshing instance network info cache due to event network-changed-a1a195c9-37d9-45f8-b8dd-989b54c8f2dd. {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-63b57f31-5c4b-40b0-ae2d-4af030b5a182 req-c035796d-0f2c-444a-b71f-2d7edc9fe362 service nova] Acquiring lock "refresh_cache-8171e321-666d-44fc-a0ef-b297ec22369b" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG nova.network.neutron [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Instance cache missing network info. 
{{(pid=71428) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG nova.network.neutron [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Updating instance_info_cache with network_info: [{"id": "a1a195c9-37d9-45f8-b8dd-989b54c8f2dd", "address": "fa:16:3e:d2:e0:f1", "network": {"id": "82debf07-b0f5-4ec4-95e8-bd8a03b0cdb9", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-890753090-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f7eb2f69eab54dd8aac3e6cca9d5e46a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1a195c9-37", "ovs_interfaceid": "a1a195c9-37d9-45f8-b8dd-989b54c8f2dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Releasing lock "refresh_cache-8171e321-666d-44fc-a0ef-b297ec22369b" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG nova.compute.manager [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Instance network_info: |[{"id": "a1a195c9-37d9-45f8-b8dd-989b54c8f2dd", "address": "fa:16:3e:d2:e0:f1", "network": {"id": "82debf07-b0f5-4ec4-95e8-bd8a03b0cdb9", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-890753090-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f7eb2f69eab54dd8aac3e6cca9d5e46a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1a195c9-37", "ovs_interfaceid": "a1a195c9-37d9-45f8-b8dd-989b54c8f2dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71428) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1972}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-63b57f31-5c4b-40b0-ae2d-4af030b5a182 req-c035796d-0f2c-444a-b71f-2d7edc9fe362 service nova] Acquired lock "refresh_cache-8171e321-666d-44fc-a0ef-b297ec22369b" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG nova.network.neutron [req-63b57f31-5c4b-40b0-ae2d-4af030b5a182 req-c035796d-0f2c-444a-b71f-2d7edc9fe362 service nova] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Refreshing network info cache for port a1a195c9-37d9-45f8-b8dd-989b54c8f2dd {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Start _get_guest_xml network_info=[{"id": "a1a195c9-37d9-45f8-b8dd-989b54c8f2dd", "address": "fa:16:3e:d2:e0:f1", "network": {"id": "82debf07-b0f5-4ec4-95e8-bd8a03b0cdb9", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-890753090-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f7eb2f69eab54dd8aac3e6cca9d5e46a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1a195c9-37", "ovs_interfaceid": "a1a195c9-37d9-45f8-b8dd-989b54c8f2dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'image_id': 'e6127373-9931-4277-9458-eceef653ea1e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 23 04:06:24 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
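
The disk-preparation records above show the sequence the libvirt driver runs for this boot: probe the cached base image with qemu-img info (wrapped in oslo_concurrency.prlimit to cap address space and CPU time), create a qcow2 overlay whose backing file is that base image, then re-probe the overlay before the can_resize_image check declines to shrink it. Below is a minimal Python sketch reproducing the same qemu-img calls outside Nova; the overlay target path and the omission of the prlimit wrapper are assumptions for illustration, while the base-image path, flags, and size are taken verbatim from the log.

    import json
    import subprocess

    # Base image path as logged; the overlay target is a hypothetical scratch
    # path, not the real Nova instance directory.
    BASE = "/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83"
    OVERLAY = "/tmp/example-overlay.qcow2"
    SIZE = 1073741824  # 1 GiB root disk, as requested for this instance

    def qemu_img_info(path):
        # Same probe the log shows, minus the oslo_concurrency.prlimit wrapper
        # Nova adds (--as limits address space, --cpu limits CPU seconds).
        out = subprocess.run(
            ["env", "LC_ALL=C", "LANG=C", "qemu-img", "info", path,
             "--force-share", "--output=json"],
            check=True, capture_output=True, text=True).stdout
        return json.loads(out)

    qemu_img_info(BASE)

    # qcow2 overlay creation exactly as logged: backing_file=<base>, backing_fmt=raw.
    subprocess.run(
        ["env", "LC_ALL=C", "LANG=C", "qemu-img", "create", "-f", "qcow2",
         "-o", f"backing_file={BASE},backing_fmt=raw", OVERLAY, str(SIZE)],
        check=True)

    # The overlay already reports a 1 GiB virtual size, so the driver skips the
    # resize and logs "Cannot resize image ... to a smaller size."
    assert qemu_img_info(OVERLAY)["virtual-size"] == SIZE
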
Apr 23 04:06:24 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 04:06:24 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71428) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-23T03:43:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=), allow threads: True {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Flavor limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Image limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Flavor pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Image pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG 
nova.virt.hardware [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Got 1 possible topologies {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T04:06:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SnapshotDataIntegrityTests-server-1589353047',display_name='tempest-SnapshotDataIntegrityTests-server-1589353047',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-snapshotdataintegritytests-server-1589353047',id=24,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAu60RPBRY8ixX0DVPXyTC0obHZtMSQn+/eQe4CX41J6ZGl9/35/LZ6Mg27oUkmRdsopV1StlsbxkgFLQHMAYMn979AL7gketvktWqIxDsED8DUOxN69qCEepNW76Mz/YA==',key_name='tempest-SnapshotDataIntegrityTests-739066652',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f7eb2f69eab54dd8aac3e6cca9d5e46a',ramdisk_id='',reservation_id='r-1jl7tgfp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-SnapshotDataIntegrityTests-64123451',owner_user_name='tempest-SnapshotDataIntegrityTests-64123451-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T04:06:23Z,user_data=None,user_id='c452706dd7dc4f54b45e9f959288bd6b',uuid=8171e321-666d-44fc-a0ef-b297ec22369b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a1a195c9-37d9-45f8-b8dd-989b54c8f2dd", "address": "fa:16:3e:d2:e0:f1", "network": {"id": "82debf07-b0f5-4ec4-95e8-bd8a03b0cdb9", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-890753090-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f7eb2f69eab54dd8aac3e6cca9d5e46a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1a195c9-37", "ovs_interfaceid": "a1a195c9-37d9-45f8-b8dd-989b54c8f2dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71428) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Converting VIF {"id": "a1a195c9-37d9-45f8-b8dd-989b54c8f2dd", "address": "fa:16:3e:d2:e0:f1", "network": {"id": "82debf07-b0f5-4ec4-95e8-bd8a03b0cdb9", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-890753090-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f7eb2f69eab54dd8aac3e6cca9d5e46a", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1a195c9-37", "ovs_interfaceid": "a1a195c9-37d9-45f8-b8dd-989b54c8f2dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d2:e0:f1,bridge_name='br-int',has_traffic_filtering=True,id=a1a195c9-37d9-45f8-b8dd-989b54c8f2dd,network=Network(82debf07-b0f5-4ec4-95e8-bd8a03b0cdb9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa1a195c9-37') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG nova.objects.instance [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Lazy-loading 'pci_devices' on Instance uuid 8171e321-666d-44fc-a0ef-b297ec22369b {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] End _get_guest_xml xml= Apr 23 04:06:24 user nova-compute[71428]: 8171e321-666d-44fc-a0ef-b297ec22369b Apr 23 04:06:24 user nova-compute[71428]: instance-00000018 Apr 23 04:06:24 user nova-compute[71428]: 131072 Apr 23 04:06:24 user nova-compute[71428]: 1 Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: tempest-SnapshotDataIntegrityTests-server-1589353047 Apr 23 04:06:24 user nova-compute[71428]: 2023-04-23 04:06:24 Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: 128 Apr 23 04:06:24 user nova-compute[71428]: 1 Apr 23 04:06:24 user nova-compute[71428]: 0 Apr 23 04:06:24 user nova-compute[71428]: 0 Apr 23 04:06:24 user nova-compute[71428]: 1 Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: tempest-SnapshotDataIntegrityTests-64123451-project-member Apr 23 04:06:24 user nova-compute[71428]: tempest-SnapshotDataIntegrityTests-64123451 Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: OpenStack Foundation Apr 23 04:06:24 user nova-compute[71428]: OpenStack Nova Apr 23 04:06:24 user nova-compute[71428]: 0.0.0 Apr 23 04:06:24 user nova-compute[71428]: 8171e321-666d-44fc-a0ef-b297ec22369b Apr 23 04:06:24 user nova-compute[71428]: 
8171e321-666d-44fc-a0ef-b297ec22369b Apr 23 04:06:24 user nova-compute[71428]: Virtual Machine Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: hvm Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Nehalem Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: /dev/urandom Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: Apr 23 04:06:24 user nova-compute[71428]: {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T04:06:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SnapshotDataIntegrityTests-server-1589353047',display_name='tempest-SnapshotDataIntegrityTests-server-1589353047',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-snapshotdataintegritytests-server-1589353047',id=24,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAu60RPBRY8ixX0DVPXyTC0obHZtMSQn+/eQe4CX41J6ZGl9/35/LZ6Mg27oUkmRdsopV1StlsbxkgFLQHMAYMn979AL7gketvktWqIxDsED8DUOxN69qCEepNW76Mz/YA==',key_name='tempest-SnapshotDataIntegrityTests-739066652',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f7eb2f69eab54dd8aac3e6cca9d5e46a',ramdisk_id='',reservation_id='r-1jl7tgfp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-SnapshotDataIntegrityTests-64123451',owner_user_name='tempest-SnapshotDataIntegrityTests-64123451-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T04:06:23Z,user_data=None,user_id='c452706dd7dc4f54b45e9f959288bd6b',uuid=8171e321-666d-44fc-a0ef-b297ec22369b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a1a195c9-37d9-45f8-b8dd-989b54c8f2dd", "address": "fa:16:3e:d2:e0:f1", "network": {"id": "82debf07-b0f5-4ec4-95e8-bd8a03b0cdb9", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-890753090-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f7eb2f69eab54dd8aac3e6cca9d5e46a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1a195c9-37", "ovs_interfaceid": "a1a195c9-37d9-45f8-b8dd-989b54c8f2dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Converting VIF {"id": "a1a195c9-37d9-45f8-b8dd-989b54c8f2dd", "address": "fa:16:3e:d2:e0:f1", "network": {"id": "82debf07-b0f5-4ec4-95e8-bd8a03b0cdb9", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-890753090-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f7eb2f69eab54dd8aac3e6cca9d5e46a", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1a195c9-37", "ovs_interfaceid": "a1a195c9-37d9-45f8-b8dd-989b54c8f2dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d2:e0:f1,bridge_name='br-int',has_traffic_filtering=True,id=a1a195c9-37d9-45f8-b8dd-989b54c8f2dd,network=Network(82debf07-b0f5-4ec4-95e8-bd8a03b0cdb9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa1a195c9-37') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG os_vif [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d2:e0:f1,bridge_name='br-int',has_traffic_filtering=True,id=a1a195c9-37d9-45f8-b8dd-989b54c8f2dd,network=Network(82debf07-b0f5-4ec4-95e8-bd8a03b0cdb9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa1a195c9-37') {{(pid=71428) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa1a195c9-37, may_exist=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa1a195c9-37, col_values=(('external_ids', {'iface-id': 'a1a195c9-37d9-45f8-b8dd-989b54c8f2dd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d2:e0:f1', 'vm-uuid': '8171e321-666d-44fc-a0ef-b297ec22369b'}),)) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:06:24 user nova-compute[71428]: INFO os_vif [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d2:e0:f1,bridge_name='br-int',has_traffic_filtering=True,id=a1a195c9-37d9-45f8-b8dd-989b54c8f2dd,network=Network(82debf07-b0f5-4ec4-95e8-bd8a03b0cdb9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa1a195c9-37') Apr 23 04:06:24 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] No BDM found with device name vda, not building metadata. {{(pid=71428) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 23 04:06:24 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] No VIF found with MAC fa:16:3e:d2:e0:f1, not building metadata {{(pid=71428) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 23 04:06:25 user nova-compute[71428]: DEBUG nova.network.neutron [req-63b57f31-5c4b-40b0-ae2d-4af030b5a182 req-c035796d-0f2c-444a-b71f-2d7edc9fe362 service nova] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Updated VIF entry in instance network info cache for port a1a195c9-37d9-45f8-b8dd-989b54c8f2dd. 
{{(pid=71428) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 23 04:06:25 user nova-compute[71428]: DEBUG nova.network.neutron [req-63b57f31-5c4b-40b0-ae2d-4af030b5a182 req-c035796d-0f2c-444a-b71f-2d7edc9fe362 service nova] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Updating instance_info_cache with network_info: [{"id": "a1a195c9-37d9-45f8-b8dd-989b54c8f2dd", "address": "fa:16:3e:d2:e0:f1", "network": {"id": "82debf07-b0f5-4ec4-95e8-bd8a03b0cdb9", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-890753090-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f7eb2f69eab54dd8aac3e6cca9d5e46a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1a195c9-37", "ovs_interfaceid": "a1a195c9-37d9-45f8-b8dd-989b54c8f2dd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 04:06:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-63b57f31-5c4b-40b0-ae2d-4af030b5a182 req-c035796d-0f2c-444a-b71f-2d7edc9fe362 service nova] Releasing lock "refresh_cache-8171e321-666d-44fc-a0ef-b297ec22369b" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 04:06:25 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:06:25 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:06:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:06:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:06:25 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:06:25 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None 
req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Auditing locally available compute resources for user (node: user) {{(pid=71428) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 23 04:06:26 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:06:26 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:06:26 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:06:26 user nova-compute[71428]: DEBUG nova.compute.manager [req-22beb47f-7ee8-476a-bedb-2ed1404771ad req-ea041d0f-f065-4cc2-8081-64ced52a674e service nova] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Received event network-vif-plugged-a1a195c9-37d9-45f8-b8dd-989b54c8f2dd {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 04:06:26 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-22beb47f-7ee8-476a-bedb-2ed1404771ad req-ea041d0f-f065-4cc2-8081-64ced52a674e service nova] Acquiring lock "8171e321-666d-44fc-a0ef-b297ec22369b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:06:26 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-22beb47f-7ee8-476a-bedb-2ed1404771ad req-ea041d0f-f065-4cc2-8081-64ced52a674e service nova] Lock "8171e321-666d-44fc-a0ef-b297ec22369b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:06:26 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-22beb47f-7ee8-476a-bedb-2ed1404771ad req-ea041d0f-f065-4cc2-8081-64ced52a674e service nova] Lock "8171e321-666d-44fc-a0ef-b297ec22369b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:06:26 user nova-compute[71428]: DEBUG nova.compute.manager [req-22beb47f-7ee8-476a-bedb-2ed1404771ad req-ea041d0f-f065-4cc2-8081-64ced52a674e service nova] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] No waiting events found dispatching network-vif-plugged-a1a195c9-37d9-45f8-b8dd-989b54c8f2dd {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 04:06:26 user nova-compute[71428]: WARNING nova.compute.manager [req-22beb47f-7ee8-476a-bedb-2ed1404771ad req-ea041d0f-f065-4cc2-8081-64ced52a674e service nova] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Received unexpected event network-vif-plugged-a1a195c9-37d9-45f8-b8dd-989b54c8f2dd for instance with vm_state building and task_state spawning. 
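
The VIF plug sequence above is driven through ovsdbapp's native OVSDB transactions (AddBridgeCommand, AddPortCommand, and a DbSetCommand on the Interface row). For reference, a rough equivalent using the ovs-vsctl CLI is sketched below in Python; this is an assumption-laden illustration rather than the code path os-vif actually executes (it talks to ovsdb-server directly instead of shelling out), with the bridge name, port name, and external_ids copied from the records above.

    import subprocess

    BRIDGE = "br-int"
    PORT = "tapa1a195c9-37"
    EXTERNAL_IDS = {
        "iface-id": "a1a195c9-37d9-45f8-b8dd-989b54c8f2dd",
        "iface-status": "active",
        "attached-mac": "fa:16:3e:d2:e0:f1",
        "vm-uuid": "8171e321-666d-44fc-a0ef-b297ec22369b",
    }

    # AddBridgeCommand(may_exist=True, datapath_type=system); the log notes the
    # transaction caused no change because br-int already exists.
    subprocess.run(
        ["ovs-vsctl", "--may-exist", "add-br", BRIDGE,
         "--", "set", "Bridge", BRIDGE, "datapath_type=system"],
        check=True)

    # AddPortCommand plus DbSetCommand on the Interface row, collapsed into one
    # ovs-vsctl invocation that sets the external_ids shown in the log.
    cmd = ["ovs-vsctl", "--may-exist", "add-port", BRIDGE, PORT,
           "--", "set", "Interface", PORT]
    cmd += [f'external_ids:{key}="{value}"' for key, value in EXTERNAL_IDS.items()]
    subprocess.run(cmd, check=True)
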
Apr 23 04:06:26 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4205fb36-0df5-4c11-b468-89f59d97352c/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:06:26 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:06:26 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:06:26 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4205fb36-0df5-4c11-b468-89f59d97352c/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:06:26 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4205fb36-0df5-4c11-b468-89f59d97352c/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:06:26 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4205fb36-0df5-4c11-b468-89f59d97352c/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:06:26 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8171e321-666d-44fc-a0ef-b297ec22369b/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:06:27 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8171e321-666d-44fc-a0ef-b297ec22369b/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:06:27 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/8171e321-666d-44fc-a0ef-b297ec22369b/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:06:27 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8171e321-666d-44fc-a0ef-b297ec22369b/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:06:27 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 04:06:27 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 04:06:27 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Hypervisor/Node resource view: name=user free_ram=8950MB free_disk=26.252365112304688GB free_vcpus=10 pci_devices=[{"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", 
"product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", 
"dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71428) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 23 04:06:27 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:06:27 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:06:27 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 4205fb36-0df5-4c11-b468-89f59d97352c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 04:06:27 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 8171e321-666d-44fc-a0ef-b297ec22369b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 04:06:27 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Total usable vcpus: 12, total allocated vcpus: 2 {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 23 04:06:27 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Final resource view: name=user phys_ram=16023MB used_ram=768MB phys_disk=40GB used_disk=2GB total_vcpus=12 used_vcpus=2 pci_stats=[] {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 23 04:06:27 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 04:06:27 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 04:06:28 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Compute_service record updated for user:user {{(pid=71428) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 23 04:06:28 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.353s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:06:28 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Resumed> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 04:06:28 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] VM Resumed (Lifecycle Event) Apr 23 04:06:28 user nova-compute[71428]: DEBUG nova.compute.manager [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Instance event wait completed in 0 seconds for {{(pid=71428) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 23 04:06:28 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Guest created on hypervisor {{(pid=71428) spawn 
/opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 23 04:06:28 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Instance spawned successfully. Apr 23 04:06:28 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 23 04:06:28 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 04:06:28 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 04:06:28 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Found default for hw_cdrom_bus of ide {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 04:06:28 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Found default for hw_disk_bus of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 04:06:28 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Found default for hw_input_bus of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 04:06:28 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Found default for hw_pointer_model of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 04:06:28 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Found default for hw_video_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 04:06:28 user 
nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Found default for hw_vif_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 04:06:28 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] During sync_power_state the instance has a pending task (spawning). Skip. Apr 23 04:06:28 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Started> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 04:06:28 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] VM Started (Lifecycle Event) Apr 23 04:06:28 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 04:06:28 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 04:06:28 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] During sync_power_state the instance has a pending task (spawning). Skip. Apr 23 04:06:28 user nova-compute[71428]: INFO nova.compute.manager [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Took 5.25 seconds to spawn the instance on the hypervisor. Apr 23 04:06:28 user nova-compute[71428]: DEBUG nova.compute.manager [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 04:06:28 user nova-compute[71428]: INFO nova.compute.manager [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Took 5.76 seconds to build instance. 
Apr 23 04:06:28 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-422dd174-07ea-4231-b040-6a90648552e6 tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Lock "8171e321-666d-44fc-a0ef-b297ec22369b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 5.852s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:06:28 user nova-compute[71428]: DEBUG nova.compute.manager [req-e36f7c45-b457-4ffa-a613-c808c2a54d07 req-2ff195f3-41b0-4f0c-8c6a-206fff931883 service nova] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Received event network-vif-plugged-a1a195c9-37d9-45f8-b8dd-989b54c8f2dd {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 04:06:28 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-e36f7c45-b457-4ffa-a613-c808c2a54d07 req-2ff195f3-41b0-4f0c-8c6a-206fff931883 service nova] Acquiring lock "8171e321-666d-44fc-a0ef-b297ec22369b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:06:28 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-e36f7c45-b457-4ffa-a613-c808c2a54d07 req-2ff195f3-41b0-4f0c-8c6a-206fff931883 service nova] Lock "8171e321-666d-44fc-a0ef-b297ec22369b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:06:28 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-e36f7c45-b457-4ffa-a613-c808c2a54d07 req-2ff195f3-41b0-4f0c-8c6a-206fff931883 service nova] Lock "8171e321-666d-44fc-a0ef-b297ec22369b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:06:28 user nova-compute[71428]: DEBUG nova.compute.manager [req-e36f7c45-b457-4ffa-a613-c808c2a54d07 req-2ff195f3-41b0-4f0c-8c6a-206fff931883 service nova] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] No waiting events found dispatching network-vif-plugged-a1a195c9-37d9-45f8-b8dd-989b54c8f2dd {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 04:06:28 user nova-compute[71428]: WARNING nova.compute.manager [req-e36f7c45-b457-4ffa-a613-c808c2a54d07 req-2ff195f3-41b0-4f0c-8c6a-206fff931883 service nova] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Received unexpected event network-vif-plugged-a1a195c9-37d9-45f8-b8dd-989b54c8f2dd for instance with vm_state active and task_state None. 
Apr 23 04:06:29 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:06:29 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:06:32 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:06:32 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Starting heal instance info cache {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 23 04:06:32 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Rebuilding the list of instances to heal {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 23 04:06:32 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "refresh_cache-4205fb36-0df5-4c11-b468-89f59d97352c" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 04:06:32 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquired lock "refresh_cache-4205fb36-0df5-4c11-b468-89f59d97352c" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 04:06:32 user nova-compute[71428]: DEBUG nova.network.neutron [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Forcefully refreshing network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 23 04:06:32 user nova-compute[71428]: DEBUG nova.objects.instance [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lazy-loading 'info_cache' on Instance uuid 4205fb36-0df5-4c11-b468-89f59d97352c {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 04:06:32 user nova-compute[71428]: DEBUG nova.network.neutron [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Updating instance_info_cache with network_info: [{"id": "bb5db91e-75d7-4a5c-90c4-66035e0f7628", "address": "fa:16:3e:96:1f:e7", "network": {"id": "cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1309461192-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7d84d2e6776c4ac38b90b752a36600c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb5db91e-75", "ovs_interfaceid": "bb5db91e-75d7-4a5c-90c4-66035e0f7628", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 04:06:32 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Releasing lock "refresh_cache-4205fb36-0df5-4c11-b468-89f59d97352c" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 04:06:32 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Updated the network info_cache for instance {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 23 04:06:32 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:06:32 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:06:34 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:06:34 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:06:34 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:06:35 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:06:39 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:06:44 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:06:44 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:06:49 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:06:49 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:06:54 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:06:54 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 
{{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:06:59 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:06:59 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:06:59 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71428) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 23 04:06:59 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 04:06:59 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 04:06:59 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:07:04 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:07:09 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:07:09 user nova-compute[71428]: DEBUG nova.compute.manager [req-f5746e6a-8314-4287-8a0c-96bafec2b286 req-6bc5c730-8725-4899-aec4-b4a6466401c9 service nova] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Received event network-changed-bb5db91e-75d7-4a5c-90c4-66035e0f7628 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 04:07:09 user nova-compute[71428]: DEBUG nova.compute.manager [req-f5746e6a-8314-4287-8a0c-96bafec2b286 req-6bc5c730-8725-4899-aec4-b4a6466401c9 service nova] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Refreshing instance network info cache due to event network-changed-bb5db91e-75d7-4a5c-90c4-66035e0f7628. 
{{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 23 04:07:09 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-f5746e6a-8314-4287-8a0c-96bafec2b286 req-6bc5c730-8725-4899-aec4-b4a6466401c9 service nova] Acquiring lock "refresh_cache-4205fb36-0df5-4c11-b468-89f59d97352c" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 04:07:09 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-f5746e6a-8314-4287-8a0c-96bafec2b286 req-6bc5c730-8725-4899-aec4-b4a6466401c9 service nova] Acquired lock "refresh_cache-4205fb36-0df5-4c11-b468-89f59d97352c" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 04:07:09 user nova-compute[71428]: DEBUG nova.network.neutron [req-f5746e6a-8314-4287-8a0c-96bafec2b286 req-6bc5c730-8725-4899-aec4-b4a6466401c9 service nova] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Refreshing network info cache for port bb5db91e-75d7-4a5c-90c4-66035e0f7628 {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 23 04:07:09 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:07:10 user nova-compute[71428]: DEBUG nova.network.neutron [req-f5746e6a-8314-4287-8a0c-96bafec2b286 req-6bc5c730-8725-4899-aec4-b4a6466401c9 service nova] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Updated VIF entry in instance network info cache for port bb5db91e-75d7-4a5c-90c4-66035e0f7628. {{(pid=71428) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 23 04:07:10 user nova-compute[71428]: DEBUG nova.network.neutron [req-f5746e6a-8314-4287-8a0c-96bafec2b286 req-6bc5c730-8725-4899-aec4-b4a6466401c9 service nova] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Updating instance_info_cache with network_info: [{"id": "bb5db91e-75d7-4a5c-90c4-66035e0f7628", "address": "fa:16:3e:96:1f:e7", "network": {"id": "cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1309461192-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.31", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7d84d2e6776c4ac38b90b752a36600c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb5db91e-75", "ovs_interfaceid": "bb5db91e-75d7-4a5c-90c4-66035e0f7628", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 04:07:10 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-f5746e6a-8314-4287-8a0c-96bafec2b286 req-6bc5c730-8725-4899-aec4-b4a6466401c9 service nova] Releasing lock "refresh_cache-4205fb36-0df5-4c11-b468-89f59d97352c" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 04:07:11 user nova-compute[71428]: DEBUG 
oslo_concurrency.lockutils [None req-6a712214-da73-4b87-8a31-e13d79e6382c tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Acquiring lock "4205fb36-0df5-4c11-b468-89f59d97352c" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:07:11 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-6a712214-da73-4b87-8a31-e13d79e6382c tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Lock "4205fb36-0df5-4c11-b468-89f59d97352c" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:07:11 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-6a712214-da73-4b87-8a31-e13d79e6382c tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Acquiring lock "4205fb36-0df5-4c11-b468-89f59d97352c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:07:11 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-6a712214-da73-4b87-8a31-e13d79e6382c tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Lock "4205fb36-0df5-4c11-b468-89f59d97352c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:07:11 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-6a712214-da73-4b87-8a31-e13d79e6382c tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Lock "4205fb36-0df5-4c11-b468-89f59d97352c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:07:11 user nova-compute[71428]: INFO nova.compute.manager [None req-6a712214-da73-4b87-8a31-e13d79e6382c tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Terminating instance Apr 23 04:07:11 user nova-compute[71428]: DEBUG nova.compute.manager [None req-6a712214-da73-4b87-8a31-e13d79e6382c tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Start destroying the instance on the hypervisor. 
{{(pid=71428) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 23 04:07:11 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:07:11 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:07:11 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:07:11 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Instance destroyed successfully. Apr 23 04:07:11 user nova-compute[71428]: DEBUG nova.objects.instance [None req-6a712214-da73-4b87-8a31-e13d79e6382c tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Lazy-loading 'resources' on Instance uuid 4205fb36-0df5-4c11-b468-89f59d97352c {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 04:07:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-6a712214-da73-4b87-8a31-e13d79e6382c tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T04:05:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1244076250',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-1244076250',id=23,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBByIgBCVO1TwHgSgzcCgt56xToj40uPOeHYofZEu1bJvNmhrfUCOYMcMRL/mO4ojjhrAB0fmobOyBQIrSmCW/jGTnyXoK8G1fv5rap39s97Qi9SpKbhu+T8NGRCCInZWaw==',key_name='tempest-keypair-679082328',keypairs=,launch_index=0,launched_at=2023-04-23T04:05:24Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='7d84d2e6776c4ac38b90b752a36600c3',ramdisk_id='',reservation_id='r-tmf2omnf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeShelveTestJSON-1622749322',owner_user_name='tempest-AttachVolumeShelveTestJSON-1622749322-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-23T04:05:25Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='929f88dec4234641a37fdda799108cf2',uuid=4205fb36-0df5-4c11-b468-89f59d97352c,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bb5db91e-75d7-4a5c-90c4-66035e0f7628", "address": "fa:16:3e:96:1f:e7", "network": {"id": "cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1309461192-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.31", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7d84d2e6776c4ac38b90b752a36600c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb5db91e-75", "ovs_interfaceid": "bb5db91e-75d7-4a5c-90c4-66035e0f7628", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 23 04:07:11 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-6a712214-da73-4b87-8a31-e13d79e6382c tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Converting VIF {"id": "bb5db91e-75d7-4a5c-90c4-66035e0f7628", "address": "fa:16:3e:96:1f:e7", "network": {"id": "cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1309461192-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.31", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7d84d2e6776c4ac38b90b752a36600c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb5db91e-75", "ovs_interfaceid": "bb5db91e-75d7-4a5c-90c4-66035e0f7628", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 04:07:11 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-6a712214-da73-4b87-8a31-e13d79e6382c tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:96:1f:e7,bridge_name='br-int',has_traffic_filtering=True,id=bb5db91e-75d7-4a5c-90c4-66035e0f7628,network=Network(cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb5db91e-75') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 04:07:12 user nova-compute[71428]: DEBUG os_vif [None req-6a712214-da73-4b87-8a31-e13d79e6382c tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:96:1f:e7,bridge_name='br-int',has_traffic_filtering=True,id=bb5db91e-75d7-4a5c-90c4-66035e0f7628,network=Network(cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb5db91e-75') {{(pid=71428) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 23 04:07:12 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:07:12 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbb5db91e-75, bridge=br-int, if_exists=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 04:07:12 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:07:12 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:07:12 user nova-compute[71428]: INFO os_vif [None req-6a712214-da73-4b87-8a31-e13d79e6382c tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:96:1f:e7,bridge_name='br-int',has_traffic_filtering=True,id=bb5db91e-75d7-4a5c-90c4-66035e0f7628,network=Network(cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb5db91e-75') Apr 23 04:07:12 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-6a712214-da73-4b87-8a31-e13d79e6382c 
tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Deleting instance files /opt/stack/data/nova/instances/4205fb36-0df5-4c11-b468-89f59d97352c_del Apr 23 04:07:12 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-6a712214-da73-4b87-8a31-e13d79e6382c tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Deletion of /opt/stack/data/nova/instances/4205fb36-0df5-4c11-b468-89f59d97352c_del complete Apr 23 04:07:12 user nova-compute[71428]: DEBUG nova.compute.manager [req-87d12946-81d2-4c1f-a741-404107736e2d req-b811c349-0fcd-4564-a1a5-9c2283efa795 service nova] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Received event network-vif-unplugged-bb5db91e-75d7-4a5c-90c4-66035e0f7628 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 04:07:12 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-87d12946-81d2-4c1f-a741-404107736e2d req-b811c349-0fcd-4564-a1a5-9c2283efa795 service nova] Acquiring lock "4205fb36-0df5-4c11-b468-89f59d97352c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:07:12 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-87d12946-81d2-4c1f-a741-404107736e2d req-b811c349-0fcd-4564-a1a5-9c2283efa795 service nova] Lock "4205fb36-0df5-4c11-b468-89f59d97352c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:07:12 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-87d12946-81d2-4c1f-a741-404107736e2d req-b811c349-0fcd-4564-a1a5-9c2283efa795 service nova] Lock "4205fb36-0df5-4c11-b468-89f59d97352c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:07:12 user nova-compute[71428]: DEBUG nova.compute.manager [req-87d12946-81d2-4c1f-a741-404107736e2d req-b811c349-0fcd-4564-a1a5-9c2283efa795 service nova] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] No waiting events found dispatching network-vif-unplugged-bb5db91e-75d7-4a5c-90c4-66035e0f7628 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 04:07:12 user nova-compute[71428]: DEBUG nova.compute.manager [req-87d12946-81d2-4c1f-a741-404107736e2d req-b811c349-0fcd-4564-a1a5-9c2283efa795 service nova] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Received event network-vif-unplugged-bb5db91e-75d7-4a5c-90c4-66035e0f7628 for instance with task_state deleting. {{(pid=71428) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 23 04:07:12 user nova-compute[71428]: INFO nova.compute.manager [None req-6a712214-da73-4b87-8a31-e13d79e6382c tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Took 0.64 seconds to destroy the instance on the hypervisor. 
Apr 23 04:07:12 user nova-compute[71428]: DEBUG oslo.service.loopingcall [None req-6a712214-da73-4b87-8a31-e13d79e6382c tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71428) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 23 04:07:12 user nova-compute[71428]: DEBUG nova.compute.manager [-] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Deallocating network for instance {{(pid=71428) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 23 04:07:12 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] deallocate_for_instance() {{(pid=71428) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 23 04:07:12 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:07:12 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Updating instance_info_cache with network_info: [] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 04:07:12 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Took 0.78 seconds to deallocate network for instance. Apr 23 04:07:12 user nova-compute[71428]: DEBUG nova.compute.manager [req-c7241f66-5f89-41c8-b3f9-c1e4240dfd87 req-5b0b02d2-194d-4246-8130-e1256c1dbb1a service nova] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Received event network-vif-deleted-bb5db91e-75d7-4a5c-90c4-66035e0f7628 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 04:07:12 user nova-compute[71428]: INFO nova.compute.manager [req-c7241f66-5f89-41c8-b3f9-c1e4240dfd87 req-5b0b02d2-194d-4246-8130-e1256c1dbb1a service nova] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Neutron deleted interface bb5db91e-75d7-4a5c-90c4-66035e0f7628; detaching it from the instance and deleting it from the info cache Apr 23 04:07:12 user nova-compute[71428]: DEBUG nova.network.neutron [req-c7241f66-5f89-41c8-b3f9-c1e4240dfd87 req-5b0b02d2-194d-4246-8130-e1256c1dbb1a service nova] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Updating instance_info_cache with network_info: [] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 04:07:12 user nova-compute[71428]: DEBUG nova.compute.manager [req-c7241f66-5f89-41c8-b3f9-c1e4240dfd87 req-5b0b02d2-194d-4246-8130-e1256c1dbb1a service nova] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Detach interface failed, port_id=bb5db91e-75d7-4a5c-90c4-66035e0f7628, reason: Instance 4205fb36-0df5-4c11-b468-89f59d97352c could not be found. 
{{(pid=71428) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 23 04:07:12 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-6a712214-da73-4b87-8a31-e13d79e6382c tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:07:12 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-6a712214-da73-4b87-8a31-e13d79e6382c tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:07:13 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-6a712214-da73-4b87-8a31-e13d79e6382c tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 04:07:13 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-6a712214-da73-4b87-8a31-e13d79e6382c tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 04:07:13 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-6a712214-da73-4b87-8a31-e13d79e6382c tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.131s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:07:13 user nova-compute[71428]: INFO nova.scheduler.client.report [None req-6a712214-da73-4b87-8a31-e13d79e6382c tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Deleted allocations for instance 4205fb36-0df5-4c11-b468-89f59d97352c Apr 23 04:07:13 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-6a712214-da73-4b87-8a31-e13d79e6382c tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Lock "4205fb36-0df5-4c11-b468-89f59d97352c" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.730s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:07:14 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} 
Apr 23 04:07:14 user nova-compute[71428]: DEBUG nova.compute.manager [req-33c211d0-17e4-4beb-af82-6c635ff18507 req-131e55f8-2468-4167-9c5a-31d135431a7c service nova] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Received event network-vif-plugged-bb5db91e-75d7-4a5c-90c4-66035e0f7628 {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 04:07:14 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-33c211d0-17e4-4beb-af82-6c635ff18507 req-131e55f8-2468-4167-9c5a-31d135431a7c service nova] Acquiring lock "4205fb36-0df5-4c11-b468-89f59d97352c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:07:14 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-33c211d0-17e4-4beb-af82-6c635ff18507 req-131e55f8-2468-4167-9c5a-31d135431a7c service nova] Lock "4205fb36-0df5-4c11-b468-89f59d97352c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:07:14 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-33c211d0-17e4-4beb-af82-6c635ff18507 req-131e55f8-2468-4167-9c5a-31d135431a7c service nova] Lock "4205fb36-0df5-4c11-b468-89f59d97352c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:07:14 user nova-compute[71428]: DEBUG nova.compute.manager [req-33c211d0-17e4-4beb-af82-6c635ff18507 req-131e55f8-2468-4167-9c5a-31d135431a7c service nova] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] No waiting events found dispatching network-vif-plugged-bb5db91e-75d7-4a5c-90c4-66035e0f7628 {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 04:07:14 user nova-compute[71428]: WARNING nova.compute.manager [req-33c211d0-17e4-4beb-af82-6c635ff18507 req-131e55f8-2468-4167-9c5a-31d135431a7c service nova] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Received unexpected event network-vif-plugged-bb5db91e-75d7-4a5c-90c4-66035e0f7628 for instance with vm_state deleted and task_state None. 
Apr 23 04:07:16 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:07:16 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Cleaning up deleted instances with incomplete migration {{(pid=71428) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 23 04:07:17 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:07:18 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:07:18 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:07:22 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:07:23 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:07:23 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71428) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 23 04:07:24 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:07:26 user nova-compute[71428]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 04:07:26 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] VM Stopped (Lifecycle Event) Apr 23 04:07:27 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e2c8f973-263d-491e-88fc-21b899007b41 None None] [instance: 4205fb36-0df5-4c11-b468-89f59d97352c] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 04:07:27 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:07:27 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:07:27 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71428) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 23 04:07:27 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 04:07:27 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 04:07:27 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:07:27 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:07:27 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:07:27 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:07:27 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:07:27 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None 
req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:07:27 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Auditing locally available compute resources for user (node: user) {{(pid=71428) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 23 04:07:27 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8171e321-666d-44fc-a0ef-b297ec22369b/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:07:27 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8171e321-666d-44fc-a0ef-b297ec22369b/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:07:27 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8171e321-666d-44fc-a0ef-b297ec22369b/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:07:28 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8171e321-666d-44fc-a0ef-b297ec22369b/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:07:28 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 04:07:28 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
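The update_available_resource audit recorded just above (04:07:27-04:07:28) shows the resource tracker probing the instance disk by shelling out to "qemu-img info ... --force-share --output=json", wrapped in oslo_concurrency.prlimit so the child process is capped at 1 GiB of address space and 30 seconds of CPU time. The snippet below is a minimal stand-alone sketch of that probe, not Nova's own code: it assumes qemu-img and the oslo.concurrency package are installed (as they are on the host that produced this log) and simply re-issues the command line captured above, parsing its JSON output.

    #!/usr/bin/env python3
    """Re-run the qemu-img info probe seen in the nova-compute log above."""
    import json
    import subprocess
    import sys

    def qemu_img_info(disk_path: str) -> dict:
        # Mirror the logged command: prlimit caps the child at 1 GiB of
        # address space (--as) and 30 s of CPU time (--cpu) before exec'ing
        # qemu-img under a C locale.
        cmd = [
            sys.executable, "-m", "oslo_concurrency.prlimit",
            "--as=1073741824", "--cpu=30", "--",
            "env", "LC_ALL=C", "LANG=C",
            "qemu-img", "info", disk_path, "--force-share", "--output=json",
        ]
        result = subprocess.run(cmd, check=True, capture_output=True, text=True)
        return json.loads(result.stdout)

    if __name__ == "__main__":
        # Pass an instance disk path, e.g. one of the
        # /opt/stack/data/nova/instances/<uuid>/disk files named in the log.
        info = qemu_img_info(sys.argv[1])
        print(info.get("format"), info.get("virtual-size"))

The --force-share flag matters here because the guest may hold the image open read-write while the audit runs; without it qemu-img would refuse to take the image lock.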
Apr 23 04:07:28 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Hypervisor/Node resource view: name=user free_ram=8976MB free_disk=26.2509765625GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71428) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 23 04:07:28 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:07:28 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:07:28 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance 8171e321-666d-44fc-a0ef-b297ec22369b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 04:07:28 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 23 04:07:28 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 23 04:07:28 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 04:07:28 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 04:07:28 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Compute_service record updated for user:user {{(pid=71428) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 23 04:07:28 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.305s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:07:30 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:07:30 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Starting heal instance info cache {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 23 04:07:31 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "refresh_cache-8171e321-666d-44fc-a0ef-b297ec22369b" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 04:07:31 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquired lock "refresh_cache-8171e321-666d-44fc-a0ef-b297ec22369b" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 04:07:31 user 
nova-compute[71428]: DEBUG nova.network.neutron [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Forcefully refreshing network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 23 04:07:31 user nova-compute[71428]: DEBUG nova.network.neutron [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Updating instance_info_cache with network_info: [{"id": "a1a195c9-37d9-45f8-b8dd-989b54c8f2dd", "address": "fa:16:3e:d2:e0:f1", "network": {"id": "82debf07-b0f5-4ec4-95e8-bd8a03b0cdb9", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-890753090-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f7eb2f69eab54dd8aac3e6cca9d5e46a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1a195c9-37", "ovs_interfaceid": "a1a195c9-37d9-45f8-b8dd-989b54c8f2dd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 04:07:31 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Releasing lock "refresh_cache-8171e321-666d-44fc-a0ef-b297ec22369b" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 04:07:31 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Updated the network info_cache for instance {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 23 04:07:32 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:07:32 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:07:34 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:07:35 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:07:37 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:07:42 user nova-compute[71428]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:07:44 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:07:47 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:07:49 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:07:49 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Cleaning up deleted instances {{(pid=71428) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 23 04:07:49 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] There are 0 instances to clean {{(pid=71428) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 23 04:07:52 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:07:57 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:07:57 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:07:57 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71428) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 23 04:07:57 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 04:07:57 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 04:07:57 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:08:02 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:08:02 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:08:05 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Acquiring lock "b04c49a4-646d-43aa-96ea-d835bf673e42" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:08:05 user 
nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Lock "b04c49a4-646d-43aa-96ea-d835bf673e42" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:08:05 user nova-compute[71428]: DEBUG nova.compute.manager [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Starting instance... {{(pid=71428) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 23 04:08:05 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:08:05 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:08:05 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71428) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 23 04:08:05 user nova-compute[71428]: INFO nova.compute.claims [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Claim successful on node user Apr 23 04:08:05 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 04:08:05 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 04:08:05 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.231s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:08:05 user nova-compute[71428]: DEBUG nova.compute.manager [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Start building networks asynchronously for instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 23 04:08:05 user nova-compute[71428]: DEBUG nova.compute.manager [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Allocating IP information in the background. {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 23 04:08:05 user nova-compute[71428]: DEBUG nova.network.neutron [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] allocate_for_instance() {{(pid=71428) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 23 04:08:05 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 23 04:08:05 user nova-compute[71428]: DEBUG nova.compute.manager [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Start building block device mappings for instance. {{(pid=71428) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 23 04:08:06 user nova-compute[71428]: DEBUG nova.policy [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '929f88dec4234641a37fdda799108cf2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7d84d2e6776c4ac38b90b752a36600c3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71428) authorize /opt/stack/nova/nova/policy.py:203}} Apr 23 04:08:06 user nova-compute[71428]: DEBUG nova.compute.manager [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Start spawning the instance on the hypervisor. {{(pid=71428) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 23 04:08:06 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Creating instance directory {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 23 04:08:06 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Creating image(s) Apr 23 04:08:06 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Acquiring lock "/opt/stack/data/nova/instances/b04c49a4-646d-43aa-96ea-d835bf673e42/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:08:06 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Lock "/opt/stack/data/nova/instances/b04c49a4-646d-43aa-96ea-d835bf673e42/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:08:06 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 
tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Lock "/opt/stack/data/nova/instances/b04c49a4-646d-43aa-96ea-d835bf673e42/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:08:06 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:08:06 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.132s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:08:06 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Acquiring lock "935dc3474dddbde110531a31510941caddc0ae83" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:08:06 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:08:06 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:08:06 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.137s {{(pid=71428) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:08:06 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/b04c49a4-646d-43aa-96ea-d835bf673e42/disk 1073741824 {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:08:06 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83,backing_fmt=raw /opt/stack/data/nova/instances/b04c49a4-646d-43aa-96ea-d835bf673e42/disk 1073741824" returned: 0 in 0.048s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:08:06 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Lock "935dc3474dddbde110531a31510941caddc0ae83" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.192s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:08:06 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:08:06 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/935dc3474dddbde110531a31510941caddc0ae83 --force-share --output=json" returned: 0 in 0.134s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:08:06 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Checking if we can resize image /opt/stack/data/nova/instances/b04c49a4-646d-43aa-96ea-d835bf673e42/disk. 
size=1073741824 {{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 23 04:08:06 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b04c49a4-646d-43aa-96ea-d835bf673e42/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:08:06 user nova-compute[71428]: DEBUG nova.network.neutron [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Successfully created port: 1b9a901a-8358-4b1c-89a7-de0772c2697e {{(pid=71428) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 23 04:08:06 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b04c49a4-646d-43aa-96ea-d835bf673e42/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:08:06 user nova-compute[71428]: DEBUG nova.virt.disk.api [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Cannot resize image /opt/stack/data/nova/instances/b04c49a4-646d-43aa-96ea-d835bf673e42/disk to a smaller size. 
{{(pid=71428) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 23 04:08:06 user nova-compute[71428]: DEBUG nova.objects.instance [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Lazy-loading 'migration_context' on Instance uuid b04c49a4-646d-43aa-96ea-d835bf673e42 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 04:08:06 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Created local disks {{(pid=71428) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 23 04:08:06 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Ensure instance console log exists: /opt/stack/data/nova/instances/b04c49a4-646d-43aa-96ea-d835bf673e42/console.log {{(pid=71428) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 23 04:08:06 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:08:06 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:08:06 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:08:07 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:08:07 user nova-compute[71428]: DEBUG nova.network.neutron [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Successfully updated port: 1b9a901a-8358-4b1c-89a7-de0772c2697e {{(pid=71428) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 23 04:08:07 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Acquiring lock "refresh_cache-b04c49a4-646d-43aa-96ea-d835bf673e42" {{(pid=71428) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 04:08:07 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Acquired lock "refresh_cache-b04c49a4-646d-43aa-96ea-d835bf673e42" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 04:08:07 user nova-compute[71428]: DEBUG nova.network.neutron [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Building network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 23 04:08:07 user nova-compute[71428]: DEBUG nova.compute.manager [req-b1da0f43-3fae-470f-838d-c9b83c6a3ca6 req-41c810d1-7576-4dc9-98fc-c39b8f63d127 service nova] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Received event network-changed-1b9a901a-8358-4b1c-89a7-de0772c2697e {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 04:08:07 user nova-compute[71428]: DEBUG nova.compute.manager [req-b1da0f43-3fae-470f-838d-c9b83c6a3ca6 req-41c810d1-7576-4dc9-98fc-c39b8f63d127 service nova] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Refreshing instance network info cache due to event network-changed-1b9a901a-8358-4b1c-89a7-de0772c2697e. {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 23 04:08:07 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-b1da0f43-3fae-470f-838d-c9b83c6a3ca6 req-41c810d1-7576-4dc9-98fc-c39b8f63d127 service nova] Acquiring lock "refresh_cache-b04c49a4-646d-43aa-96ea-d835bf673e42" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 04:08:07 user nova-compute[71428]: DEBUG nova.network.neutron [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Instance cache missing network info. 
{{(pid=71428) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 23 04:08:07 user nova-compute[71428]: DEBUG nova.network.neutron [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Updating instance_info_cache with network_info: [{"id": "1b9a901a-8358-4b1c-89a7-de0772c2697e", "address": "fa:16:3e:1e:7c:4c", "network": {"id": "cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1309461192-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7d84d2e6776c4ac38b90b752a36600c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b9a901a-83", "ovs_interfaceid": "1b9a901a-8358-4b1c-89a7-de0772c2697e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 04:08:07 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Releasing lock "refresh_cache-b04c49a4-646d-43aa-96ea-d835bf673e42" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 04:08:07 user nova-compute[71428]: DEBUG nova.compute.manager [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Instance network_info: |[{"id": "1b9a901a-8358-4b1c-89a7-de0772c2697e", "address": "fa:16:3e:1e:7c:4c", "network": {"id": "cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1309461192-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7d84d2e6776c4ac38b90b752a36600c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b9a901a-83", "ovs_interfaceid": "1b9a901a-8358-4b1c-89a7-de0772c2697e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71428) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 23 04:08:07 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-b1da0f43-3fae-470f-838d-c9b83c6a3ca6 req-41c810d1-7576-4dc9-98fc-c39b8f63d127 service nova] Acquired lock "refresh_cache-b04c49a4-646d-43aa-96ea-d835bf673e42" {{(pid=71428) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 04:08:07 user nova-compute[71428]: DEBUG nova.network.neutron [req-b1da0f43-3fae-470f-838d-c9b83c6a3ca6 req-41c810d1-7576-4dc9-98fc-c39b8f63d127 service nova] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Refreshing network info cache for port 1b9a901a-8358-4b1c-89a7-de0772c2697e {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 23 04:08:07 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Start _get_guest_xml network_info=[{"id": "1b9a901a-8358-4b1c-89a7-de0772c2697e", "address": "fa:16:3e:1e:7c:4c", "network": {"id": "cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1309461192-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7d84d2e6776c4ac38b90b752a36600c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b9a901a-83", "ovs_interfaceid": "1b9a901a-8358-4b1c-89a7-de0772c2697e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'encryption_options': None, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'image_id': 'e6127373-9931-4277-9458-eceef653ea1e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 23 04:08:07 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 04:08:07 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 23 04:08:07 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71428) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 23 04:08:07 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-23T03:43:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-23T03:42:31Z,direct_url=,disk_format='qcow2',id=e6127373-9931-4277-9458-eceef653ea1e,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='d5291373e38944d6ba358117c8fc1163',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-23T03:42:33Z,virtual_size=,visibility=), allow threads: True {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 23 04:08:07 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Flavor limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 23 04:08:07 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Image limits 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 23 04:08:07 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Flavor pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 23 04:08:07 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Image pref 0:0:0 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 23 04:08:07 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71428) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 23 04:08:07 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum 
VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 23 04:08:07 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 23 04:08:07 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Got 1 possible topologies {{(pid=71428) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 23 04:08:07 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 23 04:08:07 user nova-compute[71428]: DEBUG nova.virt.hardware [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71428) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 23 04:08:07 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T04:08:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1737853671',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-1737853671',id=25,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBLaxA+1Weh2HPeZq1FfwnTWJ4BZ8EwnjCZzkTMH7vd1FpYkudJNqvSYFJUarAtJBTeATbAP1ryV5lwSJmvcK7bPPAXyowr2p/BTQS9EUQ/tnhqgthSuv9uMBFC3mW38WQ==',key_name='tempest-keypair-911657479',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7d84d2e6776c4ac38b90b752a36600c3',ramdisk_id='',reservation_id='r-ekb617ud',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-1622749322',owner_user_name='tempest-AttachVolumeShelveTestJSON-1622749322-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T04:08:06Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='929f88dec4234641a37fdda799108cf2',uuid=b04c49a4-646d-43aa-96ea-d835bf673e42,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1b9a901a-8358-4b1c-89a7-de0772c2697e", "address": "fa:16:3e:1e:7c:4c", "network": {"id": "cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1309461192-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7d84d2e6776c4ac38b90b752a36600c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b9a901a-83", "ovs_interfaceid": "1b9a901a-8358-4b1c-89a7-de0772c2697e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71428) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 23 04:08:07 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Converting VIF {"id": "1b9a901a-8358-4b1c-89a7-de0772c2697e", "address": "fa:16:3e:1e:7c:4c", "network": {"id": "cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1309461192-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], 
"meta": {"injected": false, "tenant_id": "7d84d2e6776c4ac38b90b752a36600c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b9a901a-83", "ovs_interfaceid": "1b9a901a-8358-4b1c-89a7-de0772c2697e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 04:08:07 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:7c:4c,bridge_name='br-int',has_traffic_filtering=True,id=1b9a901a-8358-4b1c-89a7-de0772c2697e,network=Network(cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b9a901a-83') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 04:08:07 user nova-compute[71428]: DEBUG nova.objects.instance [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Lazy-loading 'pci_devices' on Instance uuid b04c49a4-646d-43aa-96ea-d835bf673e42 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 04:08:07 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] End _get_guest_xml xml= Apr 23 04:08:07 user nova-compute[71428]: b04c49a4-646d-43aa-96ea-d835bf673e42 Apr 23 04:08:07 user nova-compute[71428]: instance-00000019 Apr 23 04:08:07 user nova-compute[71428]: 131072 Apr 23 04:08:07 user nova-compute[71428]: 1 Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: tempest-AttachVolumeShelveTestJSON-server-1737853671 Apr 23 04:08:07 user nova-compute[71428]: 2023-04-23 04:08:07 Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: 128 Apr 23 04:08:07 user nova-compute[71428]: 1 Apr 23 04:08:07 user nova-compute[71428]: 0 Apr 23 04:08:07 user nova-compute[71428]: 0 Apr 23 04:08:07 user nova-compute[71428]: 1 Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: tempest-AttachVolumeShelveTestJSON-1622749322-project-member Apr 23 04:08:07 user nova-compute[71428]: tempest-AttachVolumeShelveTestJSON-1622749322 Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: OpenStack Foundation Apr 23 04:08:07 user nova-compute[71428]: OpenStack Nova Apr 23 04:08:07 user 
nova-compute[71428]: 0.0.0 Apr 23 04:08:07 user nova-compute[71428]: b04c49a4-646d-43aa-96ea-d835bf673e42 Apr 23 04:08:07 user nova-compute[71428]: b04c49a4-646d-43aa-96ea-d835bf673e42 Apr 23 04:08:07 user nova-compute[71428]: Virtual Machine Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: hvm Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Nehalem Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: /dev/urandom Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: Apr 23 04:08:07 user nova-compute[71428]: {{(pid=71428) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 23 04:08:08 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T04:08:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1737853671',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-1737853671',id=25,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBLaxA+1Weh2HPeZq1FfwnTWJ4BZ8EwnjCZzkTMH7vd1FpYkudJNqvSYFJUarAtJBTeATbAP1ryV5lwSJmvcK7bPPAXyowr2p/BTQS9EUQ/tnhqgthSuv9uMBFC3mW38WQ==',key_name='tempest-keypair-911657479',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7d84d2e6776c4ac38b90b752a36600c3',ramdisk_id='',reservation_id='r-ekb617ud',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-1622749322',owner_user_name='tempest-AttachVolumeShelveTestJSON-1622749322-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-23T04:08:06Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='929f88dec4234641a37fdda799108cf2',uuid=b04c49a4-646d-43aa-96ea-d835bf673e42,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1b9a901a-8358-4b1c-89a7-de0772c2697e", "address": "fa:16:3e:1e:7c:4c", "network": {"id": "cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1309461192-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7d84d2e6776c4ac38b90b752a36600c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b9a901a-83", "ovs_interfaceid": "1b9a901a-8358-4b1c-89a7-de0772c2697e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 23 04:08:08 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Converting VIF {"id": "1b9a901a-8358-4b1c-89a7-de0772c2697e", "address": "fa:16:3e:1e:7c:4c", "network": {"id": "cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1309461192-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "7d84d2e6776c4ac38b90b752a36600c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b9a901a-83", "ovs_interfaceid": "1b9a901a-8358-4b1c-89a7-de0772c2697e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 04:08:08 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:7c:4c,bridge_name='br-int',has_traffic_filtering=True,id=1b9a901a-8358-4b1c-89a7-de0772c2697e,network=Network(cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b9a901a-83') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 04:08:08 user nova-compute[71428]: DEBUG os_vif [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:7c:4c,bridge_name='br-int',has_traffic_filtering=True,id=1b9a901a-8358-4b1c-89a7-de0772c2697e,network=Network(cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b9a901a-83') {{(pid=71428) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 23 04:08:08 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:08:08 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 04:08:08 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 23 04:08:08 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:08:08 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1b9a901a-83, may_exist=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 04:08:08 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1b9a901a-83, col_values=(('external_ids', {'iface-id': '1b9a901a-8358-4b1c-89a7-de0772c2697e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1e:7c:4c', 'vm-uuid': 'b04c49a4-646d-43aa-96ea-d835bf673e42'}),)) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 04:08:08 user 
nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:08:08 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:08:08 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:08:08 user nova-compute[71428]: INFO os_vif [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:7c:4c,bridge_name='br-int',has_traffic_filtering=True,id=1b9a901a-8358-4b1c-89a7-de0772c2697e,network=Network(cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b9a901a-83') Apr 23 04:08:08 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] No BDM found with device name vda, not building metadata. {{(pid=71428) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 23 04:08:08 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] No VIF found with MAC fa:16:3e:1e:7c:4c, not building metadata {{(pid=71428) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 23 04:08:08 user nova-compute[71428]: DEBUG nova.network.neutron [req-b1da0f43-3fae-470f-838d-c9b83c6a3ca6 req-41c810d1-7576-4dc9-98fc-c39b8f63d127 service nova] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Updated VIF entry in instance network info cache for port 1b9a901a-8358-4b1c-89a7-de0772c2697e. 
{{(pid=71428) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 23 04:08:08 user nova-compute[71428]: DEBUG nova.network.neutron [req-b1da0f43-3fae-470f-838d-c9b83c6a3ca6 req-41c810d1-7576-4dc9-98fc-c39b8f63d127 service nova] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Updating instance_info_cache with network_info: [{"id": "1b9a901a-8358-4b1c-89a7-de0772c2697e", "address": "fa:16:3e:1e:7c:4c", "network": {"id": "cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1309461192-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7d84d2e6776c4ac38b90b752a36600c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b9a901a-83", "ovs_interfaceid": "1b9a901a-8358-4b1c-89a7-de0772c2697e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 04:08:08 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-b1da0f43-3fae-470f-838d-c9b83c6a3ca6 req-41c810d1-7576-4dc9-98fc-c39b8f63d127 service nova] Releasing lock "refresh_cache-b04c49a4-646d-43aa-96ea-d835bf673e42" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 04:08:09 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:08:09 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:08:09 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:08:09 user nova-compute[71428]: DEBUG nova.compute.manager [req-c01071bd-6cfd-479e-9c73-17a9a3e9af22 req-65499a3f-689d-4671-ba34-90a78b9ee86f service nova] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Received event network-vif-plugged-1b9a901a-8358-4b1c-89a7-de0772c2697e {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 04:08:09 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-c01071bd-6cfd-479e-9c73-17a9a3e9af22 req-65499a3f-689d-4671-ba34-90a78b9ee86f service nova] Acquiring lock "b04c49a4-646d-43aa-96ea-d835bf673e42-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:08:09 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-c01071bd-6cfd-479e-9c73-17a9a3e9af22 req-65499a3f-689d-4671-ba34-90a78b9ee86f service nova] Lock "b04c49a4-646d-43aa-96ea-d835bf673e42-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:08:09 
user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-c01071bd-6cfd-479e-9c73-17a9a3e9af22 req-65499a3f-689d-4671-ba34-90a78b9ee86f service nova] Lock "b04c49a4-646d-43aa-96ea-d835bf673e42-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:08:09 user nova-compute[71428]: DEBUG nova.compute.manager [req-c01071bd-6cfd-479e-9c73-17a9a3e9af22 req-65499a3f-689d-4671-ba34-90a78b9ee86f service nova] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] No waiting events found dispatching network-vif-plugged-1b9a901a-8358-4b1c-89a7-de0772c2697e {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 04:08:09 user nova-compute[71428]: WARNING nova.compute.manager [req-c01071bd-6cfd-479e-9c73-17a9a3e9af22 req-65499a3f-689d-4671-ba34-90a78b9ee86f service nova] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Received unexpected event network-vif-plugged-1b9a901a-8358-4b1c-89a7-de0772c2697e for instance with vm_state building and task_state spawning. Apr 23 04:08:09 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:08:09 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:08:09 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:08:11 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Resumed> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 04:08:11 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] VM Resumed (Lifecycle Event) Apr 23 04:08:11 user nova-compute[71428]: DEBUG nova.compute.manager [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Instance event wait completed in 0 seconds for {{(pid=71428) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 23 04:08:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Guest created on hypervisor {{(pid=71428) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 23 04:08:11 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Instance spawned successfully. 
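The plug sequence recorded at 04:08:08 above (AddBridgeCommand, AddPortCommand and DbSetCommand against br-int, followed by "Successfully plugged vif") can be replayed outside Nova with ovsdbapp to inspect the same OVSDB state. A minimal sketch under stated assumptions: the OVSDB socket path is an assumption, and this is not Nova's own plug code; the port name, MAC, iface-id and instance uuid are copied from the log.

    # Sketch: replay the br-int port plug seen in the log with ovsdbapp.
    # Assumes a local OVSDB server at unix:/var/run/openvswitch/db.sock.
    from ovsdbapp.backend.ovs_idl import connection
    from ovsdbapp.schema.open_vswitch import impl_idl

    OVSDB = 'unix:/var/run/openvswitch/db.sock'   # assumption: default socket path
    idl = connection.OvsdbIdl.from_server(OVSDB, 'Open_vSwitch')
    api = impl_idl.OvsdbIdl(connection.Connection(idl=idl, timeout=10))

    external_ids = {                               # values taken from the log above
        'iface-id': '1b9a901a-8358-4b1c-89a7-de0772c2697e',
        'iface-status': 'active',
        'attached-mac': 'fa:16:3e:1e:7c:4c',
        'vm-uuid': 'b04c49a4-646d-43aa-96ea-d835bf673e42',
    }

    # One transaction mirroring the AddBridgeCommand/AddPortCommand/DbSetCommand
    # entries in the log; may_exist=True keeps the bridge/port calls idempotent.
    with api.transaction(check_error=True) as txn:
        txn.add(api.add_br('br-int', may_exist=True, datapath_type='system'))
        txn.add(api.add_port('br-int', 'tap1b9a901a-83', may_exist=True))
        txn.add(api.db_set('Interface', 'tap1b9a901a-83',
                           ('external_ids', external_ids)))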
Apr 23 04:08:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 23 04:08:11 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 04:08:11 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 04:08:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Found default for hw_cdrom_bus of ide {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 04:08:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Found default for hw_disk_bus of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 04:08:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Found default for hw_input_bus of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 04:08:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Found default for hw_pointer_model of None {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 04:08:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Found default for hw_video_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 04:08:11 user nova-compute[71428]: DEBUG nova.virt.libvirt.driver [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 
tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Found default for hw_vif_model of virtio {{(pid=71428) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 23 04:08:11 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] During sync_power_state the instance has a pending task (spawning). Skip. Apr 23 04:08:11 user nova-compute[71428]: DEBUG nova.virt.driver [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] Emitting event Started> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 04:08:11 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] VM Started (Lifecycle Event) Apr 23 04:08:11 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 04:08:11 user nova-compute[71428]: DEBUG nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71428) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 23 04:08:11 user nova-compute[71428]: INFO nova.compute.manager [None req-e0a5ca10-5e3f-4c62-8b72-fd9483a6d9d7 None None] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] During sync_power_state the instance has a pending task (spawning). Skip. Apr 23 04:08:11 user nova-compute[71428]: INFO nova.compute.manager [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Took 5.50 seconds to spawn the instance on the hypervisor. 
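The "Found default for hw_* of ..." lines above show the driver recording bus/model defaults for image properties the cirros image never set, so the instance keeps the same virtual hardware across later lifecycle operations. A small illustrative sketch of that back-fill step; the dict and function names are made up for illustration, only the property names and default values come from the log.

    # Sketch: back-fill driver defaults for image properties the image left unset,
    # mirroring the hw_* values reported in the log. Names are illustrative only.
    DRIVER_DEFAULTS = {
        'hw_cdrom_bus': 'ide',
        'hw_disk_bus': 'virtio',
        'hw_input_bus': None,
        'hw_pointer_model': None,
        'hw_video_model': 'virtio',
        'hw_vif_model': 'virtio',
    }

    def register_undefined_defaults(image_properties: dict) -> dict:
        """Return the defaults to record because the image did not define them
        (None values are recorded too, marking 'no preference')."""
        return {key: value
                for key, value in DRIVER_DEFAULTS.items()
                if key not in image_properties}

    # The cirros image in this run defined none of these, so all six are recorded:
    print(register_undefined_defaults({}))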
Apr 23 04:08:11 user nova-compute[71428]: DEBUG nova.compute.manager [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 04:08:11 user nova-compute[71428]: DEBUG nova.compute.manager [req-744f2257-ee01-47c8-9833-cb9f702e3d11 req-1767bd83-47a0-4033-8b47-2cfc9db84bcb service nova] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Received event network-vif-plugged-1b9a901a-8358-4b1c-89a7-de0772c2697e {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 04:08:11 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-744f2257-ee01-47c8-9833-cb9f702e3d11 req-1767bd83-47a0-4033-8b47-2cfc9db84bcb service nova] Acquiring lock "b04c49a4-646d-43aa-96ea-d835bf673e42-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:08:11 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-744f2257-ee01-47c8-9833-cb9f702e3d11 req-1767bd83-47a0-4033-8b47-2cfc9db84bcb service nova] Lock "b04c49a4-646d-43aa-96ea-d835bf673e42-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:08:11 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-744f2257-ee01-47c8-9833-cb9f702e3d11 req-1767bd83-47a0-4033-8b47-2cfc9db84bcb service nova] Lock "b04c49a4-646d-43aa-96ea-d835bf673e42-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:08:11 user nova-compute[71428]: DEBUG nova.compute.manager [req-744f2257-ee01-47c8-9833-cb9f702e3d11 req-1767bd83-47a0-4033-8b47-2cfc9db84bcb service nova] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] No waiting events found dispatching network-vif-plugged-1b9a901a-8358-4b1c-89a7-de0772c2697e {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 04:08:11 user nova-compute[71428]: WARNING nova.compute.manager [req-744f2257-ee01-47c8-9833-cb9f702e3d11 req-1767bd83-47a0-4033-8b47-2cfc9db84bcb service nova] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Received unexpected event network-vif-plugged-1b9a901a-8358-4b1c-89a7-de0772c2697e for instance with vm_state building and task_state spawning. Apr 23 04:08:11 user nova-compute[71428]: INFO nova.compute.manager [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Took 6.05 seconds to build instance. 
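The WARNING lines about "unexpected event network-vif-plugged-..." above are the benign case where Neutron's notification lands after the compute manager has already popped, or never registered, a waiter for that event, so pop_instance_event finds nothing to dispatch. A rough sketch of that register-then-pop pattern, using plain threading rather than Nova's eventlet-based InstanceEvents; the class and method names are illustrative only.

    # Sketch: register-then-pop instance events keyed by (instance_uuid, event_name).
    import threading
    from collections import defaultdict

    class InstanceEventSketch:
        def __init__(self):
            self._lock = threading.Lock()
            self._waiters = defaultdict(dict)   # instance_uuid -> {event_name: Event}

        def prepare_for(self, instance_uuid, event_name):
            """Called before the action that triggers the event (e.g. plugging a VIF)."""
            with self._lock:
                ev = threading.Event()
                self._waiters[instance_uuid][event_name] = ev
                return ev

        def pop_event(self, instance_uuid, event_name):
            """Called when the external event arrives; None means 'unexpected'."""
            with self._lock:
                return self._waiters.get(instance_uuid, {}).pop(event_name, None)

    events = InstanceEventSketch()
    uuid = 'b04c49a4-646d-43aa-96ea-d835bf673e42'
    name = 'network-vif-plugged-1b9a901a-8358-4b1c-89a7-de0772c2697e'
    waiter = events.prepare_for(uuid, name)
    # ... plug the VIF, then when the notification arrives:
    found = events.pop_event(uuid, name)
    if found:
        found.set()                # a waiter existed: wake it up
    else:
        print('unexpected event')  # the case the WARNING lines above correspond to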
Apr 23 04:08:11 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-b5fe2f7c-67cf-4c3c-a2f3-8520bf7410c8 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Lock "b04c49a4-646d-43aa-96ea-d835bf673e42" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.146s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:08:13 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:08:14 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:08:14 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-3f74fe56-632e-4259-80b6-defe80543bca tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Acquiring lock "8171e321-666d-44fc-a0ef-b297ec22369b" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:08:14 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-3f74fe56-632e-4259-80b6-defe80543bca tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Lock "8171e321-666d-44fc-a0ef-b297ec22369b" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:08:14 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-3f74fe56-632e-4259-80b6-defe80543bca tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Acquiring lock "8171e321-666d-44fc-a0ef-b297ec22369b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:08:14 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-3f74fe56-632e-4259-80b6-defe80543bca tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Lock "8171e321-666d-44fc-a0ef-b297ec22369b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:08:14 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-3f74fe56-632e-4259-80b6-defe80543bca tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Lock "8171e321-666d-44fc-a0ef-b297ec22369b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:08:14 user nova-compute[71428]: INFO nova.compute.manager [None req-3f74fe56-632e-4259-80b6-defe80543bca tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Terminating instance Apr 23 04:08:14 user 
nova-compute[71428]: DEBUG nova.compute.manager [None req-3f74fe56-632e-4259-80b6-defe80543bca tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Start destroying the instance on the hypervisor. {{(pid=71428) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 23 04:08:14 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:08:14 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:08:14 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:08:14 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:08:14 user nova-compute[71428]: DEBUG nova.compute.manager [req-b4417698-428e-4644-9eda-65ddcad0426e req-16191334-68af-46ae-89f2-fdfe8f7a51b3 service nova] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Received event network-vif-unplugged-a1a195c9-37d9-45f8-b8dd-989b54c8f2dd {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 04:08:14 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-b4417698-428e-4644-9eda-65ddcad0426e req-16191334-68af-46ae-89f2-fdfe8f7a51b3 service nova] Acquiring lock "8171e321-666d-44fc-a0ef-b297ec22369b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:08:14 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-b4417698-428e-4644-9eda-65ddcad0426e req-16191334-68af-46ae-89f2-fdfe8f7a51b3 service nova] Lock "8171e321-666d-44fc-a0ef-b297ec22369b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:08:14 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-b4417698-428e-4644-9eda-65ddcad0426e req-16191334-68af-46ae-89f2-fdfe8f7a51b3 service nova] Lock "8171e321-666d-44fc-a0ef-b297ec22369b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:08:14 user nova-compute[71428]: DEBUG nova.compute.manager [req-b4417698-428e-4644-9eda-65ddcad0426e req-16191334-68af-46ae-89f2-fdfe8f7a51b3 service nova] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] No waiting events found dispatching network-vif-unplugged-a1a195c9-37d9-45f8-b8dd-989b54c8f2dd {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 04:08:14 user nova-compute[71428]: DEBUG nova.compute.manager [req-b4417698-428e-4644-9eda-65ddcad0426e req-16191334-68af-46ae-89f2-fdfe8f7a51b3 service nova] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Received event network-vif-unplugged-a1a195c9-37d9-45f8-b8dd-989b54c8f2dd for instance with task_state deleting. 
{{(pid=71428) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 23 04:08:15 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Instance destroyed successfully. Apr 23 04:08:15 user nova-compute[71428]: DEBUG nova.objects.instance [None req-3f74fe56-632e-4259-80b6-defe80543bca tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Lazy-loading 'resources' on Instance uuid 8171e321-666d-44fc-a0ef-b297ec22369b {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 04:08:15 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-3f74fe56-632e-4259-80b6-defe80543bca tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T04:06:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-SnapshotDataIntegrityTests-server-1589353047',display_name='tempest-SnapshotDataIntegrityTests-server-1589353047',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-snapshotdataintegritytests-server-1589353047',id=24,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAu60RPBRY8ixX0DVPXyTC0obHZtMSQn+/eQe4CX41J6ZGl9/35/LZ6Mg27oUkmRdsopV1StlsbxkgFLQHMAYMn979AL7gketvktWqIxDsED8DUOxN69qCEepNW76Mz/YA==',key_name='tempest-SnapshotDataIntegrityTests-739066652',keypairs=,launch_index=0,launched_at=2023-04-23T04:06:28Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='f7eb2f69eab54dd8aac3e6cca9d5e46a',ramdisk_id='',reservation_id='r-1jl7tgfp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-SnapshotDataIntegrityTests-64123451',owner_user_name='tempest-SnapshotDataIntegrityTests-64123451-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-23T04:06:28Z,user_data=None,user_id='c452706dd7dc4f54b45e9f959288bd6b',uuid=8171e321-666d-44fc-a0ef-b297ec22369b,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a1a195c9-37d9-45f8-b8dd-989b54c8f2dd", "address": "fa:16:3e:d2:e0:f1", "network": {"id": "82debf07-b0f5-4ec4-95e8-bd8a03b0cdb9", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-890753090-network", "subnets": 
[{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f7eb2f69eab54dd8aac3e6cca9d5e46a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1a195c9-37", "ovs_interfaceid": "a1a195c9-37d9-45f8-b8dd-989b54c8f2dd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 23 04:08:15 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-3f74fe56-632e-4259-80b6-defe80543bca tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Converting VIF {"id": "a1a195c9-37d9-45f8-b8dd-989b54c8f2dd", "address": "fa:16:3e:d2:e0:f1", "network": {"id": "82debf07-b0f5-4ec4-95e8-bd8a03b0cdb9", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-890753090-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f7eb2f69eab54dd8aac3e6cca9d5e46a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1a195c9-37", "ovs_interfaceid": "a1a195c9-37d9-45f8-b8dd-989b54c8f2dd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 04:08:15 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-3f74fe56-632e-4259-80b6-defe80543bca tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d2:e0:f1,bridge_name='br-int',has_traffic_filtering=True,id=a1a195c9-37d9-45f8-b8dd-989b54c8f2dd,network=Network(82debf07-b0f5-4ec4-95e8-bd8a03b0cdb9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa1a195c9-37') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 04:08:15 user nova-compute[71428]: DEBUG os_vif [None req-3f74fe56-632e-4259-80b6-defe80543bca tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d2:e0:f1,bridge_name='br-int',has_traffic_filtering=True,id=a1a195c9-37d9-45f8-b8dd-989b54c8f2dd,network=Network(82debf07-b0f5-4ec4-95e8-bd8a03b0cdb9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa1a195c9-37') {{(pid=71428) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 23 04:08:15 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:08:15 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa1a195c9-37, bridge=br-int, if_exists=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 04:08:15 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:08:15 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:08:15 user nova-compute[71428]: INFO os_vif [None req-3f74fe56-632e-4259-80b6-defe80543bca tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d2:e0:f1,bridge_name='br-int',has_traffic_filtering=True,id=a1a195c9-37d9-45f8-b8dd-989b54c8f2dd,network=Network(82debf07-b0f5-4ec4-95e8-bd8a03b0cdb9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa1a195c9-37') Apr 23 04:08:15 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-3f74fe56-632e-4259-80b6-defe80543bca tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Deleting instance files /opt/stack/data/nova/instances/8171e321-666d-44fc-a0ef-b297ec22369b_del Apr 23 04:08:15 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-3f74fe56-632e-4259-80b6-defe80543bca tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Deletion of /opt/stack/data/nova/instances/8171e321-666d-44fc-a0ef-b297ec22369b_del complete Apr 23 04:08:15 user nova-compute[71428]: INFO nova.compute.manager [None req-3f74fe56-632e-4259-80b6-defe80543bca tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Took 0.85 seconds to destroy the instance on the hypervisor. Apr 23 04:08:15 user nova-compute[71428]: DEBUG oslo.service.loopingcall [None req-3f74fe56-632e-4259-80b6-defe80543bca tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71428) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 23 04:08:15 user nova-compute[71428]: DEBUG nova.compute.manager [-] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Deallocating network for instance {{(pid=71428) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 23 04:08:15 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] deallocate_for_instance() {{(pid=71428) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 23 04:08:16 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Updating instance_info_cache with network_info: [] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 04:08:16 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Took 0.91 seconds to deallocate network for instance. Apr 23 04:08:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-3f74fe56-632e-4259-80b6-defe80543bca tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:08:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-3f74fe56-632e-4259-80b6-defe80543bca tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:08:16 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-3f74fe56-632e-4259-80b6-defe80543bca tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 04:08:16 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-3f74fe56-632e-4259-80b6-defe80543bca tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 04:08:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-3f74fe56-632e-4259-80b6-defe80543bca tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.123s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:08:16 user nova-compute[71428]: INFO nova.scheduler.client.report 
[None req-3f74fe56-632e-4259-80b6-defe80543bca tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Deleted allocations for instance 8171e321-666d-44fc-a0ef-b297ec22369b Apr 23 04:08:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-3f74fe56-632e-4259-80b6-defe80543bca tempest-SnapshotDataIntegrityTests-64123451 tempest-SnapshotDataIntegrityTests-64123451-project-member] Lock "8171e321-666d-44fc-a0ef-b297ec22369b" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.047s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:08:16 user nova-compute[71428]: DEBUG nova.compute.manager [req-723a6a95-d227-4315-be3f-617f487434f8 req-1a9dd17f-3806-4fe8-99d2-8721ef682f51 service nova] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Received event network-vif-plugged-a1a195c9-37d9-45f8-b8dd-989b54c8f2dd {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 04:08:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-723a6a95-d227-4315-be3f-617f487434f8 req-1a9dd17f-3806-4fe8-99d2-8721ef682f51 service nova] Acquiring lock "8171e321-666d-44fc-a0ef-b297ec22369b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:08:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-723a6a95-d227-4315-be3f-617f487434f8 req-1a9dd17f-3806-4fe8-99d2-8721ef682f51 service nova] Lock "8171e321-666d-44fc-a0ef-b297ec22369b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:08:16 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-723a6a95-d227-4315-be3f-617f487434f8 req-1a9dd17f-3806-4fe8-99d2-8721ef682f51 service nova] Lock "8171e321-666d-44fc-a0ef-b297ec22369b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:08:16 user nova-compute[71428]: DEBUG nova.compute.manager [req-723a6a95-d227-4315-be3f-617f487434f8 req-1a9dd17f-3806-4fe8-99d2-8721ef682f51 service nova] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] No waiting events found dispatching network-vif-plugged-a1a195c9-37d9-45f8-b8dd-989b54c8f2dd {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 04:08:16 user nova-compute[71428]: WARNING nova.compute.manager [req-723a6a95-d227-4315-be3f-617f487434f8 req-1a9dd17f-3806-4fe8-99d2-8721ef682f51 service nova] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Received unexpected event network-vif-plugged-a1a195c9-37d9-45f8-b8dd-989b54c8f2dd for instance with vm_state deleted and task_state None. 
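The "Inventory has not changed" entry above carries the provider's full inventory; what Placement can actually schedule against per resource class is (total - reserved) * allocation_ratio, assuming the standard Placement capacity formula. A quick check against the logged numbers:

    # Sketch: schedulable capacity per resource class from the inventory in the log,
    # using capacity = (total - reserved) * allocation_ratio.
    inventory = {
        'VCPU':      {'total': 12,    'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 16023, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 40,    'reserved': 0,   'allocation_ratio': 1.0},
    }

    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(f"{rc}: {capacity:g}")
    # VCPU: 48, MEMORY_MB: 15511, DISK_GB: 40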
Apr 23 04:08:16 user nova-compute[71428]: DEBUG nova.compute.manager [req-723a6a95-d227-4315-be3f-617f487434f8 req-1a9dd17f-3806-4fe8-99d2-8721ef682f51 service nova] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Received event network-vif-deleted-a1a195c9-37d9-45f8-b8dd-989b54c8f2dd {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 04:08:18 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:08:20 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:08:24 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:08:25 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:08:25 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:08:25 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71428) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 23 04:08:26 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:08:28 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:08:29 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:08:29 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:08:29 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:08:29 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71428) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:08:29 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:08:29 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Auditing locally available compute resources for user (node: user) {{(pid=71428) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 23 04:08:29 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b04c49a4-646d-43aa-96ea-d835bf673e42/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:08:29 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b04c49a4-646d-43aa-96ea-d835bf673e42/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:08:29 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b04c49a4-646d-43aa-96ea-d835bf673e42/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:08:30 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b04c49a4-646d-43aa-96ea-d835bf673e42/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:08:30 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:08:30 user nova-compute[71428]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 04:08:30 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] VM Stopped (Lifecycle Event) Apr 23 04:08:30 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 23 04:08:30 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 04:08:30 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Hypervisor/Node resource view: name=user free_ram=8950MB free_disk=26.248844146728516GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": 
null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", 
"address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71428) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 23 04:08:30 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:08:30 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:08:30 user nova-compute[71428]: DEBUG nova.compute.manager [None req-95728130-ce6b-4707-b07d-98916aca3cfc None None] [instance: 8171e321-666d-44fc-a0ef-b297ec22369b] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 04:08:30 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance b04c49a4-646d-43aa-96ea-d835bf673e42 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 04:08:30 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 23 04:08:30 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 23 04:08:30 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Refreshing inventories for resource provider 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 23 04:08:30 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Updating ProviderTree inventory for provider 3017e09c-9289-4a8e-8061-3ff90149e985 from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 23 04:08:30 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Updating inventory in ProviderTree for provider 3017e09c-9289-4a8e-8061-3ff90149e985 with inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 23 04:08:30 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Refreshing aggregate associations for resource provider 3017e09c-9289-4a8e-8061-3ff90149e985, aggregates: None {{(pid=71428) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 23 04:08:30 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Refreshing trait associations for resource provider 3017e09c-9289-4a8e-8061-3ff90149e985, traits: 
COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE41,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_RESCUE_BFV,COMPUTE_NODE,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_GRAPHICS_MODEL_CIRRUS {{(pid=71428) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 23 04:08:30 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 04:08:30 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 04:08:30 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Compute_service record updated for user:user {{(pid=71428) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 23 04:08:30 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.329s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:08:31 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:08:31 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Starting heal instance info cache {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 23 04:08:31 user nova-compute[71428]: DEBUG nova.compute.manager [None 
req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Rebuilding the list of instances to heal {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 23 04:08:31 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "refresh_cache-b04c49a4-646d-43aa-96ea-d835bf673e42" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 04:08:31 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquired lock "refresh_cache-b04c49a4-646d-43aa-96ea-d835bf673e42" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 04:08:31 user nova-compute[71428]: DEBUG nova.network.neutron [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Forcefully refreshing network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 23 04:08:31 user nova-compute[71428]: DEBUG nova.objects.instance [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lazy-loading 'info_cache' on Instance uuid b04c49a4-646d-43aa-96ea-d835bf673e42 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 04:08:32 user nova-compute[71428]: DEBUG nova.network.neutron [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Updating instance_info_cache with network_info: [{"id": "1b9a901a-8358-4b1c-89a7-de0772c2697e", "address": "fa:16:3e:1e:7c:4c", "network": {"id": "cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1309461192-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7d84d2e6776c4ac38b90b752a36600c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b9a901a-83", "ovs_interfaceid": "1b9a901a-8358-4b1c-89a7-de0772c2697e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 04:08:32 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Releasing lock "refresh_cache-b04c49a4-646d-43aa-96ea-d835bf673e42" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 04:08:32 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Updated the network info_cache for instance {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 23 04:08:32 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71428) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:08:34 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:08:35 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:08:37 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:08:37 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:08:39 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:08:40 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:08:45 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:08:50 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:08:55 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:08:59 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:09:00 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:09:05 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:09:05 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:09:07 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:09:09 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:09:10 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:09:15 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:09:18 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:09:20 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:09:25 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:09:26 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:09:27 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:09:27 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71428) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 23 04:09:29 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:09:30 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:09:31 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:09:31 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Starting heal instance info cache {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 23 04:09:31 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Rebuilding the list of instances to heal {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 23 04:09:31 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "refresh_cache-b04c49a4-646d-43aa-96ea-d835bf673e42" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 04:09:31 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquired lock "refresh_cache-b04c49a4-646d-43aa-96ea-d835bf673e42" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 04:09:31 user nova-compute[71428]: DEBUG 
nova.network.neutron [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Forcefully refreshing network info cache for instance {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 23 04:09:31 user nova-compute[71428]: DEBUG nova.objects.instance [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lazy-loading 'info_cache' on Instance uuid b04c49a4-646d-43aa-96ea-d835bf673e42 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 04:09:32 user nova-compute[71428]: DEBUG nova.network.neutron [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Updating instance_info_cache with network_info: [{"id": "1b9a901a-8358-4b1c-89a7-de0772c2697e", "address": "fa:16:3e:1e:7c:4c", "network": {"id": "cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1309461192-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7d84d2e6776c4ac38b90b752a36600c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b9a901a-83", "ovs_interfaceid": "1b9a901a-8358-4b1c-89a7-de0772c2697e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 04:09:32 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Releasing lock "refresh_cache-b04c49a4-646d-43aa-96ea-d835bf673e42" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 04:09:32 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Updated the network info_cache for instance {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 23 04:09:32 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:09:32 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:09:32 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:09:32 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils 
[None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:09:32 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Auditing locally available compute resources for user (node: user) {{(pid=71428) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 23 04:09:32 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b04c49a4-646d-43aa-96ea-d835bf673e42/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:09:32 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b04c49a4-646d-43aa-96ea-d835bf673e42/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:09:32 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b04c49a4-646d-43aa-96ea-d835bf673e42/disk --force-share --output=json {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 23 04:09:32 user nova-compute[71428]: DEBUG oslo_concurrency.processutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b04c49a4-646d-43aa-96ea-d835bf673e42/disk --force-share --output=json" returned: 0 in 0.124s {{(pid=71428) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 23 04:09:33 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 04:09:33 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 23 04:09:33 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Hypervisor/Node resource view: name=user free_ram=9048MB free_disk=26.24773406982422GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71428) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 23 04:09:33 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:09:33 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:09:33 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Instance b04c49a4-646d-43aa-96ea-d835bf673e42 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71428) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 23 04:09:33 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 23 04:09:33 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 23 04:09:33 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 04:09:33 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 04:09:33 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Compute_service record updated for user:user {{(pid=71428) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 23 04:09:33 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.189s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:09:34 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:09:34 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:09:35 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:09:36 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:09:38 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rescued_instances 
{{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:09:40 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:09:45 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:09:45 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:09:45 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71428) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 23 04:09:45 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 04:09:45 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71428) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 23 04:09:45 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:09:50 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:09:55 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:09:56 user nova-compute[71428]: DEBUG nova.compute.manager [req-65e08d16-7762-478f-ba9a-471f9ca756e1 req-08a3ac72-d586-4869-945a-7262c5e14e09 service nova] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Received event network-changed-1b9a901a-8358-4b1c-89a7-de0772c2697e {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 04:09:56 user nova-compute[71428]: DEBUG nova.compute.manager [req-65e08d16-7762-478f-ba9a-471f9ca756e1 req-08a3ac72-d586-4869-945a-7262c5e14e09 service nova] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Refreshing instance network info cache due to event network-changed-1b9a901a-8358-4b1c-89a7-de0772c2697e. 
{{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 23 04:09:56 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-65e08d16-7762-478f-ba9a-471f9ca756e1 req-08a3ac72-d586-4869-945a-7262c5e14e09 service nova] Acquiring lock "refresh_cache-b04c49a4-646d-43aa-96ea-d835bf673e42" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 23 04:09:56 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-65e08d16-7762-478f-ba9a-471f9ca756e1 req-08a3ac72-d586-4869-945a-7262c5e14e09 service nova] Acquired lock "refresh_cache-b04c49a4-646d-43aa-96ea-d835bf673e42" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 23 04:09:56 user nova-compute[71428]: DEBUG nova.network.neutron [req-65e08d16-7762-478f-ba9a-471f9ca756e1 req-08a3ac72-d586-4869-945a-7262c5e14e09 service nova] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Refreshing network info cache for port 1b9a901a-8358-4b1c-89a7-de0772c2697e {{(pid=71428) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 23 04:09:57 user nova-compute[71428]: DEBUG nova.network.neutron [req-65e08d16-7762-478f-ba9a-471f9ca756e1 req-08a3ac72-d586-4869-945a-7262c5e14e09 service nova] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Updated VIF entry in instance network info cache for port 1b9a901a-8358-4b1c-89a7-de0772c2697e. {{(pid=71428) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 23 04:09:57 user nova-compute[71428]: DEBUG nova.network.neutron [req-65e08d16-7762-478f-ba9a-471f9ca756e1 req-08a3ac72-d586-4869-945a-7262c5e14e09 service nova] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Updating instance_info_cache with network_info: [{"id": "1b9a901a-8358-4b1c-89a7-de0772c2697e", "address": "fa:16:3e:1e:7c:4c", "network": {"id": "cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1309461192-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.216", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7d84d2e6776c4ac38b90b752a36600c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b9a901a-83", "ovs_interfaceid": "1b9a901a-8358-4b1c-89a7-de0772c2697e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 04:09:57 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-65e08d16-7762-478f-ba9a-471f9ca756e1 req-08a3ac72-d586-4869-945a-7262c5e14e09 service nova] Releasing lock "refresh_cache-b04c49a4-646d-43aa-96ea-d835bf673e42" {{(pid=71428) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 23 04:09:58 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-94f073b6-24e2-4a44-ba96-5dc77248d5e5 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Acquiring lock 
"b04c49a4-646d-43aa-96ea-d835bf673e42" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:09:58 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-94f073b6-24e2-4a44-ba96-5dc77248d5e5 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Lock "b04c49a4-646d-43aa-96ea-d835bf673e42" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:09:58 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-94f073b6-24e2-4a44-ba96-5dc77248d5e5 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Acquiring lock "b04c49a4-646d-43aa-96ea-d835bf673e42-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:09:58 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-94f073b6-24e2-4a44-ba96-5dc77248d5e5 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Lock "b04c49a4-646d-43aa-96ea-d835bf673e42-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:09:58 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-94f073b6-24e2-4a44-ba96-5dc77248d5e5 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Lock "b04c49a4-646d-43aa-96ea-d835bf673e42-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:09:58 user nova-compute[71428]: INFO nova.compute.manager [None req-94f073b6-24e2-4a44-ba96-5dc77248d5e5 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Terminating instance Apr 23 04:09:58 user nova-compute[71428]: DEBUG nova.compute.manager [None req-94f073b6-24e2-4a44-ba96-5dc77248d5e5 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Start destroying the instance on the hypervisor. 
{{(pid=71428) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 23 04:09:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:09:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:09:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:09:58 user nova-compute[71428]: DEBUG nova.compute.manager [req-02d7b007-578c-4582-9e63-90092ad25bff req-ecfa2430-3e9f-4b6a-bf86-0293ae90dcf0 service nova] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Received event network-vif-unplugged-1b9a901a-8358-4b1c-89a7-de0772c2697e {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 04:09:58 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-02d7b007-578c-4582-9e63-90092ad25bff req-ecfa2430-3e9f-4b6a-bf86-0293ae90dcf0 service nova] Acquiring lock "b04c49a4-646d-43aa-96ea-d835bf673e42-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:09:58 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-02d7b007-578c-4582-9e63-90092ad25bff req-ecfa2430-3e9f-4b6a-bf86-0293ae90dcf0 service nova] Lock "b04c49a4-646d-43aa-96ea-d835bf673e42-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:09:58 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-02d7b007-578c-4582-9e63-90092ad25bff req-ecfa2430-3e9f-4b6a-bf86-0293ae90dcf0 service nova] Lock "b04c49a4-646d-43aa-96ea-d835bf673e42-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:09:58 user nova-compute[71428]: DEBUG nova.compute.manager [req-02d7b007-578c-4582-9e63-90092ad25bff req-ecfa2430-3e9f-4b6a-bf86-0293ae90dcf0 service nova] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] No waiting events found dispatching network-vif-unplugged-1b9a901a-8358-4b1c-89a7-de0772c2697e {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 04:09:58 user nova-compute[71428]: DEBUG nova.compute.manager [req-02d7b007-578c-4582-9e63-90092ad25bff req-ecfa2430-3e9f-4b6a-bf86-0293ae90dcf0 service nova] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Received event network-vif-unplugged-1b9a901a-8358-4b1c-89a7-de0772c2697e for instance with task_state deleting. 
{{(pid=71428) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 23 04:09:58 user nova-compute[71428]: DEBUG nova.compute.manager [req-02d7b007-578c-4582-9e63-90092ad25bff req-ecfa2430-3e9f-4b6a-bf86-0293ae90dcf0 service nova] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Received event network-vif-plugged-1b9a901a-8358-4b1c-89a7-de0772c2697e {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 04:09:58 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-02d7b007-578c-4582-9e63-90092ad25bff req-ecfa2430-3e9f-4b6a-bf86-0293ae90dcf0 service nova] Acquiring lock "b04c49a4-646d-43aa-96ea-d835bf673e42-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:09:58 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-02d7b007-578c-4582-9e63-90092ad25bff req-ecfa2430-3e9f-4b6a-bf86-0293ae90dcf0 service nova] Lock "b04c49a4-646d-43aa-96ea-d835bf673e42-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:09:58 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [req-02d7b007-578c-4582-9e63-90092ad25bff req-ecfa2430-3e9f-4b6a-bf86-0293ae90dcf0 service nova] Lock "b04c49a4-646d-43aa-96ea-d835bf673e42-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:09:58 user nova-compute[71428]: DEBUG nova.compute.manager [req-02d7b007-578c-4582-9e63-90092ad25bff req-ecfa2430-3e9f-4b6a-bf86-0293ae90dcf0 service nova] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] No waiting events found dispatching network-vif-plugged-1b9a901a-8358-4b1c-89a7-de0772c2697e {{(pid=71428) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 23 04:09:58 user nova-compute[71428]: WARNING nova.compute.manager [req-02d7b007-578c-4582-9e63-90092ad25bff req-ecfa2430-3e9f-4b6a-bf86-0293ae90dcf0 service nova] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Received unexpected event network-vif-plugged-1b9a901a-8358-4b1c-89a7-de0772c2697e for instance with vm_state active and task_state deleting. Apr 23 04:09:58 user nova-compute[71428]: INFO nova.virt.libvirt.driver [-] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Instance destroyed successfully. 
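The "Acquiring lock ... by ...", "acquired ... waited" and "released ... held" messages above are emitted by oslo.concurrency's lock wrapper around the terminate path. Below is a minimal sketch of that pattern using only the public lockutils API; the lock name and function are illustrative stand-ins, not Nova's actual code.

from oslo_concurrency import lockutils

# Illustrative per-instance serialization; the decorator logs the
# acquire/waited/held timings around the wrapped call, as seen above.
@lockutils.synchronized('b04c49a4-646d-43aa-96ea-d835bf673e42')
def do_terminate_instance():
    # Work here runs only while the named in-process lock is held.
    pass

do_terminate_instance()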
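The "Received event network-vif-unplugged-..." and "network-vif-plugged-..." entries are delivered by Neutron through Nova's os-server-external-events API. The snippet below illustrates the documented request-body shape for such an event, with values copied from the log; it is a sketch of the payload, not Neutron's code.

# Body of a POST to /v2.1/os-server-external-events announcing that the
# port's VIF was unplugged on the instance being deleted.
payload = {
    "events": [
        {
            "name": "network-vif-unplugged",
            "server_uuid": "b04c49a4-646d-43aa-96ea-d835bf673e42",
            "tag": "1b9a901a-8358-4b1c-89a7-de0772c2697e",
        }
    ]
}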
Apr 23 04:09:58 user nova-compute[71428]: DEBUG nova.objects.instance [None req-94f073b6-24e2-4a44-ba96-5dc77248d5e5 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Lazy-loading 'resources' on Instance uuid b04c49a4-646d-43aa-96ea-d835bf673e42 {{(pid=71428) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 23 04:09:58 user nova-compute[71428]: DEBUG nova.virt.libvirt.vif [None req-94f073b6-24e2-4a44-ba96-5dc77248d5e5 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-23T04:08:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1737853671',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-1737853671',id=25,image_ref='e6127373-9931-4277-9458-eceef653ea1e',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBLaxA+1Weh2HPeZq1FfwnTWJ4BZ8EwnjCZzkTMH7vd1FpYkudJNqvSYFJUarAtJBTeATbAP1ryV5lwSJmvcK7bPPAXyowr2p/BTQS9EUQ/tnhqgthSuv9uMBFC3mW38WQ==',key_name='tempest-keypair-911657479',keypairs=,launch_index=0,launched_at=2023-04-23T04:08:11Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='7d84d2e6776c4ac38b90b752a36600c3',ramdisk_id='',reservation_id='r-ekb617ud',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='e6127373-9931-4277-9458-eceef653ea1e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeShelveTestJSON-1622749322',owner_user_name='tempest-AttachVolumeShelveTestJSON-1622749322-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-23T04:08:12Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='929f88dec4234641a37fdda799108cf2',uuid=b04c49a4-646d-43aa-96ea-d835bf673e42,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1b9a901a-8358-4b1c-89a7-de0772c2697e", "address": "fa:16:3e:1e:7c:4c", "network": {"id": "cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1309461192-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.216", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7d84d2e6776c4ac38b90b752a36600c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b9a901a-83", "ovs_interfaceid": "1b9a901a-8358-4b1c-89a7-de0772c2697e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 23 04:09:58 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-94f073b6-24e2-4a44-ba96-5dc77248d5e5 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Converting VIF {"id": "1b9a901a-8358-4b1c-89a7-de0772c2697e", "address": "fa:16:3e:1e:7c:4c", "network": {"id": "cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1309461192-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.216", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "7d84d2e6776c4ac38b90b752a36600c3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b9a901a-83", "ovs_interfaceid": "1b9a901a-8358-4b1c-89a7-de0772c2697e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 23 04:09:58 user nova-compute[71428]: DEBUG nova.network.os_vif_util [None req-94f073b6-24e2-4a44-ba96-5dc77248d5e5 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1e:7c:4c,bridge_name='br-int',has_traffic_filtering=True,id=1b9a901a-8358-4b1c-89a7-de0772c2697e,network=Network(cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b9a901a-83') {{(pid=71428) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 23 04:09:58 user nova-compute[71428]: DEBUG os_vif [None req-94f073b6-24e2-4a44-ba96-5dc77248d5e5 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:7c:4c,bridge_name='br-int',has_traffic_filtering=True,id=1b9a901a-8358-4b1c-89a7-de0772c2697e,network=Network(cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b9a901a-83') {{(pid=71428) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 23 04:09:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=71428) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:09:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1b9a901a-83, bridge=br-int, if_exists=True) {{(pid=71428) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 23 04:09:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:09:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:09:58 user nova-compute[71428]: INFO os_vif [None req-94f073b6-24e2-4a44-ba96-5dc77248d5e5 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:7c:4c,bridge_name='br-int',has_traffic_filtering=True,id=1b9a901a-8358-4b1c-89a7-de0772c2697e,network=Network(cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b9a901a-83') Apr 23 04:09:58 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-94f073b6-24e2-4a44-ba96-5dc77248d5e5 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Deleting instance files /opt/stack/data/nova/instances/b04c49a4-646d-43aa-96ea-d835bf673e42_del Apr 23 04:09:58 user nova-compute[71428]: INFO nova.virt.libvirt.driver [None req-94f073b6-24e2-4a44-ba96-5dc77248d5e5 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Deletion of /opt/stack/data/nova/instances/b04c49a4-646d-43aa-96ea-d835bf673e42_del complete Apr 23 04:09:58 user nova-compute[71428]: INFO nova.compute.manager [None req-94f073b6-24e2-4a44-ba96-5dc77248d5e5 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Took 0.68 seconds to destroy the instance on the hypervisor. Apr 23 04:09:58 user nova-compute[71428]: DEBUG oslo.service.loopingcall [None req-94f073b6-24e2-4a44-ba96-5dc77248d5e5 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71428) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 23 04:09:58 user nova-compute[71428]: DEBUG nova.compute.manager [-] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Deallocating network for instance {{(pid=71428) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 23 04:09:58 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] deallocate_for_instance() {{(pid=71428) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 23 04:09:59 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:09:59 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:09:59 user nova-compute[71428]: DEBUG nova.network.neutron [-] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Updating instance_info_cache with network_info: [] {{(pid=71428) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 23 04:09:59 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Took 1.17 seconds to deallocate network for instance. Apr 23 04:10:00 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-94f073b6-24e2-4a44-ba96-5dc77248d5e5 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:10:00 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-94f073b6-24e2-4a44-ba96-5dc77248d5e5 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:10:00 user nova-compute[71428]: DEBUG nova.compute.manager [req-4a9f10d9-7b61-4056-a308-2fe3454360c5 req-d2eeec65-96dd-497e-b95a-6a2239348a22 service nova] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Received event network-vif-deleted-1b9a901a-8358-4b1c-89a7-de0772c2697e {{(pid=71428) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 23 04:10:00 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-94f073b6-24e2-4a44-ba96-5dc77248d5e5 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 04:10:00 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-94f073b6-24e2-4a44-ba96-5dc77248d5e5 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 
'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 04:10:00 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-94f073b6-24e2-4a44-ba96-5dc77248d5e5 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.137s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:10:00 user nova-compute[71428]: INFO nova.scheduler.client.report [None req-94f073b6-24e2-4a44-ba96-5dc77248d5e5 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Deleted allocations for instance b04c49a4-646d-43aa-96ea-d835bf673e42 Apr 23 04:10:00 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-94f073b6-24e2-4a44-ba96-5dc77248d5e5 tempest-AttachVolumeShelveTestJSON-1622749322 tempest-AttachVolumeShelveTestJSON-1622749322-project-member] Lock "b04c49a4-646d-43aa-96ea-d835bf673e42" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.175s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:10:03 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:10:04 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:10:08 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:10:13 user nova-compute[71428]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71428) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 23 04:10:13 user nova-compute[71428]: INFO nova.compute.manager [-] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] VM Stopped (Lifecycle Event) Apr 23 04:10:13 user nova-compute[71428]: DEBUG nova.compute.manager [None req-d2bc66e9-28b2-4ed6-af64-63150f66b149 None None] [instance: b04c49a4-646d-43aa-96ea-d835bf673e42] Checking state {{(pid=71428) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 23 04:10:13 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:10:18 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:10:18 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:10:23 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:10:26 user nova-compute[71428]: DEBUG oslo_service.periodic_task 
[None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:10:28 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:10:29 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:10:29 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71428) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 23 04:10:30 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:10:32 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:10:32 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Starting heal instance info cache {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 23 04:10:32 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Rebuilding the list of instances to heal {{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 23 04:10:32 user nova-compute[71428]: DEBUG nova.compute.manager [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Didn't find any instances for network info cache update. 
{{(pid=71428) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 23 04:10:33 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:10:33 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:10:33 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:10:33 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:10:33 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Auditing locally available compute resources for user (node: user) {{(pid=71428) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 23 04:10:33 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:10:34 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 23 04:10:34 user nova-compute[71428]: WARNING nova.virt.libvirt.driver [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
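The recurring "Running periodic task ComputeManager._*" entries are driven by oslo.service's periodic task runner. A minimal sketch of that mechanism with the public API follows; the manager class, task name and spacing are illustrative assumptions, not Nova's implementation.

from oslo_config import cfg
from oslo_service import periodic_task


class ExampleManager(periodic_task.PeriodicTasks):
    def __init__(self):
        super().__init__(cfg.CONF)

    # Decorated methods are collected and invoked by run_periodic_tasks();
    # each invocation produces a "Running periodic task ..." DEBUG line.
    @periodic_task.periodic_task(spacing=60)
    def _heal_instance_info_cache(self, context):
        pass


manager = ExampleManager()
manager.run_periodic_tasks(context=None)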
Apr 23 04:10:34 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Hypervisor/Node resource view: name=user free_ram=9164MB free_disk=26.266624450683594GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71428) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 23 04:10:34 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 23 04:10:34 user nova-compute[71428]: DEBUG oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 23 04:10:34 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 23 04:10:34 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71428) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 23 04:10:34 user nova-compute[71428]: DEBUG nova.compute.provider_tree [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed in ProviderTree for provider: 3017e09c-9289-4a8e-8061-3ff90149e985 {{(pid=71428) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 23 04:10:34 user nova-compute[71428]: DEBUG nova.scheduler.client.report [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Inventory has not changed for provider 3017e09c-9289-4a8e-8061-3ff90149e985 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71428) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 23 04:10:34 user nova-compute[71428]: DEBUG nova.compute.resource_tracker [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Compute_service record updated for user:user {{(pid=71428) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 23 04:10:34 user nova-compute[71428]: DEBUG 
oslo_concurrency.lockutils [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.148s {{(pid=71428) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 23 04:10:36 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:10:36 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:10:38 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:10:38 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:10:40 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:10:43 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:10:48 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:10:50 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:10:53 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:10:53 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:10:58 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 23 04:11:03 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:11:08 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:11:13 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:11:18 user nova-compute[71428]: DEBUG oslo_service.periodic_task [None req-7efbf6cf-fed2-4481-81dd-9e866d8cb8b4 None None] Running periodic task 
ComputeManager._poll_unconfirmed_resizes {{(pid=71428) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 23 04:11:18 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 23 04:11:23 user nova-compute[71428]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 {{(pid=71428) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
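Earlier in this section os-vif reports "Unplugging vif VIFOpenVSwitch(...)" followed by "Successfully unplugged vif ...". Below is a minimal sketch of that call path using the public os_vif API, assuming the 'ovs' plugin loaded at service start and a reachable local OVSDB; field values are copied from the log entries above and the InstanceInfo name mirrors the instance's display_name.

import os_vif
from os_vif.objects import instance_info, network, vif

os_vif.initialize()  # loads the linux_bridge/noop/ovs plugins

vif_obj = vif.VIFOpenVSwitch(
    id='1b9a901a-8358-4b1c-89a7-de0772c2697e',
    address='fa:16:3e:1e:7c:4c',
    bridge_name='br-int',
    vif_name='tap1b9a901a-83',
    port_profile=vif.VIFPortProfileOpenVSwitch(
        interface_id='1b9a901a-8358-4b1c-89a7-de0772c2697e'),
    network=network.Network(id='cf3a95ac-e77f-4ee9-ba56-283fc32b7f8b',
                            bridge='br-int'))
inst = instance_info.InstanceInfo(
    uuid='b04c49a4-646d-43aa-96ea-d835bf673e42',
    name='tempest-AttachVolumeShelveTestJSON-server-1737853671')

# Delegates to the ovs plugin, which removes the tap port from br-int.
os_vif.unplug(vif_obj, inst)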
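The DelPortCommand transaction logged during that unplug, and the poller wakeups on tcp:127.0.0.1:6640 throughout this section, come from ovsdbapp's OVSDB IDL connection. A minimal sketch of an equivalent standalone call with the public ovsdbapp API, assuming the same local endpoint; the timeout value is an arbitrary choice.

from ovsdbapp.backend.ovs_idl import connection
from ovsdbapp.schema.open_vswitch import impl_idl

# Connect to the same OVSDB endpoint the service uses and build the
# Open_vSwitch schema API on top of it.
idl = connection.OvsdbIdl.from_server('tcp:127.0.0.1:6640', 'Open_vSwitch')
ovsdb = impl_idl.OvsdbIdl(connection.Connection(idl=idl, timeout=5))

# Equivalent of the logged DelPortCommand(port=tap1b9a901a-83,
# bridge=br-int, if_exists=True).
ovsdb.del_port('tap1b9a901a-83', bridge='br-int', if_exists=True).execute(
    check_error=True)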