Apr 18 16:09:49 user nova-compute[70975]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
Apr 18 16:09:52 user nova-compute[70975]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=70975) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
Apr 18 16:09:52 user nova-compute[70975]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=70975) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
Apr 18 16:09:52 user nova-compute[70975]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=70975) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
Apr 18 16:09:52 user nova-compute[70975]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.020s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
Apr 18 16:09:52 user nova-compute[70975]: INFO nova.virt.driver [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] Loading compute driver 'libvirt.LibvirtDriver'
Apr 18 16:09:52 user nova-compute[70975]: INFO nova.compute.provider_config [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
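The lines above cover plugin and driver startup; the entries that follow take the service singleton lock and then dump the effective configuration ("Full set of CONF"), one DEBUG line per option, produced by oslo.config's log_opt_values(). The snippet below is a minimal sketch of that mechanism only, not Nova's actual startup code; the single option registered here is hypothetical and stands in for Nova's real option definitions.

    # Sketch: parse --config-file the way an oslo.config-based service does,
    # then dump every registered option at DEBUG level, which is what
    # produces the "Full set of CONF" block that follows in this log.
    import logging
    import sys

    from oslo_config import cfg

    CONF = cfg.CONF
    LOG = logging.getLogger(__name__)

    # Illustrative option; nova registers its real options the same way.
    CONF.register_opts([cfg.BoolOpt('allow_resize_to_same_host', default=False)])

    if __name__ == '__main__':
        logging.basicConfig(level=logging.DEBUG)
        # e.g. python sketch.py --config-file /etc/nova/nova-cpu.conf
        CONF(sys.argv[1:], project='nova')
        # Emits one "name = value" DEBUG line per registered option.
        CONF.log_opt_values(LOG, logging.DEBUG)

Options present in the config file but not registered by the service are ignored, so the dump below reflects only options nova-compute itself registered.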
Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] Acquiring lock "singleton_lock" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] Acquired lock "singleton_lock" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] Releasing lock "singleton_lock" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] Full set of CONF: {{(pid=70975) _wait_for_exit_or_signal /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:362}}
Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] ******************************************************************************** {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2589}}
Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] Configuration options gathered from: {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2590}}
Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] command line args: ['--config-file', '/etc/nova/nova-cpu.conf'] {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2591}}
Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] config files: ['/etc/nova/nova-cpu.conf'] {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2592}}
Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] ================================================================================ {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2594}}

Each option below was logged as its own entry of the form "Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] <option> = <value> {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}" (cfg.py:2609 for the group-prefixed options); only the option = value pairs are repeated here, in the order logged:

    allow_resize_to_same_host = True
    arq_binding_timeout = 300
    backdoor_port = None
    backdoor_socket = None
    block_device_allocate_retries = 300
    block_device_allocate_retries_interval = 5
    cert = self.pem
    compute_driver = libvirt.LibvirtDriver
    compute_monitors = []
    config_dir = []
    config_drive_format = iso9660
    config_file = ['/etc/nova/nova-cpu.conf']
    config_source = []
    console_host = user
    control_exchange = nova
    cpu_allocation_ratio = None
    daemon = False
    debug = True
    default_access_ip_network_name = None
    default_availability_zone = nova
    default_ephemeral_format = ext4
    default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO']
    default_schedule_zone = None
    disk_allocation_ratio = None
    enable_new_services = True
    enabled_apis = ['osapi_compute']
    enabled_ssl_apis = []
    flat_injected = False
    force_config_drive = False
    force_raw_images = True
    graceful_shutdown_timeout = 5
    heal_instance_info_cache_interval = 60
    host = user
    initial_cpu_allocation_ratio = 4.0
    initial_disk_allocation_ratio = 1.0
    initial_ram_allocation_ratio = 1.0
    injected_network_template = /opt/stack/nova/nova/virt/interfaces.template
    instance_build_timeout = 0
    instance_delete_interval = 300
    instance_format = [instance: %(uuid)s]
    instance_name_template = instance-%08x
    instance_usage_audit = False
    instance_usage_audit_period = month
    instance_uuid_format = [instance: %(uuid)s]
    instances_path = /opt/stack/data/nova/instances
    internal_service_availability_zone = internal
    key = None
    live_migration_retry_count = 30
    log_config_append = None
    log_date_format = %Y-%m-%d %H:%M:%S
    log_dir = None
    log_file = None
    log_options = True
    log_rotate_interval = 1
    log_rotate_interval_type = days
    log_rotation_type = none
    logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s
    logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}}
    logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s
    logging_exception_prefix = ERROR %(name)s %(instance)s
    logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s
    long_rpc_timeout = 1800
    max_concurrent_builds = 10
    max_concurrent_live_migrations = 1
    max_concurrent_snapshots = 5
    max_local_block_devices = 3
    max_logfile_count = 30
    max_logfile_size_mb = 200
    maximum_instance_delete_attempts = 5
    metadata_listen = 0.0.0.0
    metadata_listen_port = 8775
    metadata_workers = 3
    migrate_max_retries = -1
    mkisofs_cmd = genisoimage
    my_block_storage_ip = 10.0.0.210
    my_ip = 10.0.0.210
    network_allocate_retries = 0
    non_inheritable_image_properties = ['cache_in_nova', 'bittorrent']
    osapi_compute_listen = 0.0.0.0
    osapi_compute_listen_port = 8774
    osapi_compute_unique_server_name_scope =
    osapi_compute_workers = 3
    password_length = 12
    periodic_enable = True
    periodic_fuzzy_delay = 60
    pointer_model = ps2mouse
    preallocate_images = none
    publish_errors = False
    pybasedir = /opt/stack/nova
    ram_allocation_ratio = None
    rate_limit_burst = 0
    rate_limit_except_level = CRITICAL
    rate_limit_interval = 0
    reboot_timeout = 0
    reclaim_instance_interval = 0
    record = None
    reimage_timeout_per_gb = 20
    report_interval = 10
    rescue_timeout = 0
    reserved_host_cpus = 0
    reserved_host_disk_mb = 0
    reserved_host_memory_mb = 512
    reserved_huge_pages = None
    resize_confirm_window = 0
    resize_fs_using_block_device = False
    resume_guests_state_on_host_boot = False
    rootwrap_config = /etc/nova/rootwrap.conf
    rpc_response_timeout = 60
    run_external_periodic_tasks = True
    running_deleted_instance_action = reap
    running_deleted_instance_poll_interval = 1800
    running_deleted_instance_timeout = 0
    scheduler_instance_sync_interval = 120
    service_down_time = 60
    servicegroup_driver = db
    shelved_offload_time = 0
    shelved_poll_interval = 3600
    shutdown_timeout = 0
    source_is_ipv6 = False
    ssl_only = False
    state_path = /opt/stack/data/nova
    sync_power_state_interval = 600
    sync_power_state_pool_size = 1000
    syslog_log_facility = LOG_USER
    tempdir = None
    timeout_nbd = 10
    transport_url = ****
    update_resources_interval = 0
    use_cow_images = True
    use_eventlog = False
    use_journal = False
    use_json = False
    use_rootwrap_daemon = False
    use_stderr = False
    use_syslog = False
    vcpu_pin_set = None
    vif_plugging_is_fatal = False
    vif_plugging_timeout = 0
    virt_mkfs = []
    volume_usage_poll_interval = 0
    watch_log_file = False
    web = /usr/share/spice-html5
    oslo_concurrency.disable_process_locking = False
    oslo_concurrency.lock_path = /opt/stack/data/nova
    oslo_messaging_metrics.metrics_buffer_size = 1000
    oslo_messaging_metrics.metrics_enabled = False
    oslo_messaging_metrics.metrics_process_name =
    oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock
    oslo_messaging_metrics.metrics_thread_stop_timeout = 10
    api.auth_strategy = keystone
    api.compute_link_prefix = None
    api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01
    api.dhcp_domain = novalocal
    api.enable_instance_password = True
    api.glance_link_prefix = None
    api.instance_list_cells_batch_fixed_size = 100
    api.instance_list_cells_batch_strategy = distributed
    api.instance_list_per_project_cells = False
    api.list_records_by_skipping_down_cells = True
    api.local_metadata_per_cell = False
    api.max_limit = 1000
    api.metadata_cache_expiration = 15
    api.neutron_default_tenant_id = default
    api.use_forwarded_for = False
    api.use_neutron_default_nets = False
    api.vendordata_dynamic_connect_timeout = 5
    api.vendordata_dynamic_failure_fatal = False
    api.vendordata_dynamic_read_timeout = 5
    api.vendordata_dynamic_ssl_certfile =
    api.vendordata_dynamic_targets = []
    api.vendordata_jsonfile_path = None
    api.vendordata_providers = ['StaticJSON']
    cache.backend = dogpile.cache.memcached
    cache.backend_argument = ****
    cache.config_prefix = cache.oslo
    cache.dead_timeout = 60.0
    cache.debug_cache_backend = False
    cache.enable_retry_client = False
    cache.enable_socket_keepalive = False
    cache.enabled = True
    cache.expiration_time = 600
    cache.hashclient_retry_attempts = 2
    cache.hashclient_retry_delay = 1.0
    cache.memcache_dead_retry = 300
    cache.memcache_password =
    cache.memcache_pool_connection_get_timeout = 10
    cache.memcache_pool_flush_on_reconnect = False
    cache.memcache_pool_maxsize = 10
    cache.memcache_pool_unused_timeout = 60
    cache.memcache_sasl_enabled = False
    cache.memcache_servers = ['localhost:11211']
    cache.memcache_socket_timeout = 1.0
    cache.memcache_username =
    cache.proxies = []
    cache.retry_attempts = 2
    cache.retry_delay = 0.0
    cache.socket_keepalive_count = 1
    cache.socket_keepalive_idle = 1
    cache.socket_keepalive_interval = 1
    cache.tls_allowed_ciphers = None
    cache.tls_cafile = None
    cache.tls_certfile = None
    cache.tls_enabled = False
    cache.tls_keyfile = None
    cinder.auth_section = None
    cinder.auth_type = password
    cinder.cafile = None
    cinder.catalog_info = volumev3::publicURL
    cinder.certfile = None
    cinder.collect_timing = False
    cinder.cross_az_attach = True
    cinder.debug = False
    cinder.endpoint_template = None
    cinder.http_retries = 3
    cinder.insecure = False
    cinder.keyfile = None
    cinder.os_region_name = RegionOne
    cinder.split_loggers = False
    cinder.timeout = None
    compute.consecutive_build_service_disable_threshold = 10
    compute.cpu_dedicated_set = None
    compute.cpu_shared_set = None
    compute.image_type_exclude_list = []
    compute.live_migration_wait_for_vif_plug = True
    compute.max_concurrent_disk_ops =
0 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] compute.max_disk_devices_to_attach = -1 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] compute.resource_provider_association_refresh = 300 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] compute.shutdown_retry_interval = 10 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] conductor.workers = 3 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] console.allowed_origins = [] {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] console.ssl_ciphers = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] console.ssl_minimum_version = default {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] consoleauth.token_ttl = 600 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] cyborg.cafile = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] cyborg.certfile = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG 
oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] cyborg.collect_timing = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] cyborg.connect_retries = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] cyborg.connect_retry_delay = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] cyborg.endpoint_override = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] cyborg.insecure = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] cyborg.keyfile = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] cyborg.max_version = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] cyborg.min_version = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] cyborg.region_name = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] cyborg.service_name = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] cyborg.service_type = accelerator {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] cyborg.split_loggers = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] cyborg.status_code_retries = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] cyborg.status_code_retry_delay = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None 
req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] cyborg.timeout = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] cyborg.version = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] database.backend = sqlalchemy {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] database.connection = **** {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] database.connection_debug = 0 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] database.connection_parameters = {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] database.connection_recycle_time = 3600 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] database.connection_trace = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] database.db_inc_retry_interval = True {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] database.db_max_retries = 20 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] database.db_max_retry_interval = 10 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] database.db_retry_interval = 1 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] database.max_overflow = 50 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service 
[None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] database.max_pool_size = 5 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] database.max_retries = 10 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] database.mysql_enable_ndb = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] database.mysql_sql_mode = TRADITIONAL {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] database.mysql_wsrep_sync_wait = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] database.pool_timeout = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] database.retry_interval = 10 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] database.slave_connection = **** {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] database.sqlite_synchronous = True {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] api_database.backend = sqlalchemy {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] api_database.connection = **** {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] api_database.connection_debug = 0 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] api_database.connection_parameters = {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] api_database.connection_recycle_time = 3600 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: 
DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] api_database.connection_trace = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] api_database.db_inc_retry_interval = True {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] api_database.db_max_retries = 20 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] api_database.db_max_retry_interval = 10 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] api_database.db_retry_interval = 1 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] api_database.max_overflow = 50 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] api_database.max_pool_size = 5 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] api_database.max_retries = 10 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] api_database.mysql_enable_ndb = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] api_database.mysql_wsrep_sync_wait = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] api_database.pool_timeout = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] api_database.retry_interval = 10 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] api_database.slave_connection = **** {{(pid=70975) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] api_database.sqlite_synchronous = True {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] devices.enabled_mdev_types = [] {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] ephemeral_storage_encryption.enabled = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] ephemeral_storage_encryption.key_size = 512 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] glance.api_servers = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] glance.cafile = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] glance.certfile = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] glance.collect_timing = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] glance.connect_retries = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] glance.connect_retry_delay = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] glance.debug = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] glance.default_trusted_certificate_ids = [] {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] 
glance.enable_certificate_validation = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] glance.enable_rbd_download = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] glance.endpoint_override = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] glance.insecure = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] glance.keyfile = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] glance.max_version = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] glance.min_version = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] glance.num_retries = 3 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] glance.rbd_ceph_conf = {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] glance.rbd_connect_timeout = 5 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] glance.rbd_pool = {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] glance.rbd_user = {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] glance.region_name = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] glance.service_name = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] glance.service_type = image {{(pid=70975) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] glance.split_loggers = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] glance.status_code_retries = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] glance.status_code_retry_delay = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] glance.timeout = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] glance.verify_glance_signatures = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] glance.version = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] guestfs.debug = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] hyperv.config_drive_cdrom = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] hyperv.config_drive_inject_password = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] hyperv.enable_instance_metrics_collection = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] hyperv.enable_remotefx = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] 
hyperv.instances_path_share = {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] hyperv.iscsi_initiator_list = [] {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] hyperv.limit_cpu_features = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] hyperv.power_state_check_timeframe = 60 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] hyperv.power_state_event_polling_interval = 2 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] hyperv.use_multipath_io = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] hyperv.volume_attach_retry_count = 10 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] hyperv.volume_attach_retry_interval = 5 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] hyperv.vswitch_name = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] mks.enabled = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service 
[None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] image_cache.manager_interval = 2400 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] image_cache.precache_concurrency = 1 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] image_cache.remove_unused_base_images = True {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] image_cache.subdirectory_name = _base {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] ironic.api_max_retries = 60 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] ironic.api_retry_interval = 2 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] ironic.auth_section = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] ironic.auth_type = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] ironic.cafile = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] ironic.certfile = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] ironic.collect_timing = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 
18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] ironic.connect_retries = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] ironic.connect_retry_delay = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] ironic.endpoint_override = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] ironic.insecure = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] ironic.keyfile = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] ironic.max_version = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] ironic.min_version = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] ironic.partition_key = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] ironic.peer_list = [] {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] ironic.region_name = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] ironic.serial_console_state_timeout = 10 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] ironic.service_name = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] ironic.service_type = baremetal {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] ironic.split_loggers = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG 
oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] ironic.status_code_retries = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] ironic.status_code_retry_delay = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] ironic.timeout = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] ironic.version = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] key_manager.fixed_key = **** {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] barbican.barbican_api_version = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] barbican.barbican_endpoint = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] barbican.barbican_endpoint_type = public {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] barbican.barbican_region_name = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] barbican.cafile = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] barbican.certfile = None {{(pid=70975) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] barbican.collect_timing = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] barbican.insecure = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] barbican.keyfile = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] barbican.number_of_retries = 60 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] barbican.retry_delay = 1 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] barbican.send_service_user_token = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] barbican.split_loggers = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] barbican.timeout = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] barbican.verify_ssl = True {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] barbican.verify_ssl_path = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] barbican_service_user.auth_section = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] barbican_service_user.auth_type = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] barbican_service_user.cafile = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] barbican_service_user.certfile = None {{(pid=70975) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] barbican_service_user.collect_timing = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] barbican_service_user.insecure = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] barbican_service_user.keyfile = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] barbican_service_user.split_loggers = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] barbican_service_user.timeout = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vault.approle_role_id = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vault.approle_secret_id = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vault.cafile = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vault.certfile = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vault.collect_timing = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vault.insecure = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vault.keyfile = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vault.kv_mountpoint = secret {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vault.kv_version = 2 {{(pid=70975) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vault.namespace = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vault.root_token_id = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vault.split_loggers = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vault.ssl_ca_crt_file = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vault.timeout = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vault.use_ssl = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] keystone.cafile = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] keystone.certfile = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] keystone.collect_timing = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] keystone.connect_retries = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] keystone.connect_retry_delay = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] keystone.endpoint_override = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] keystone.insecure = False {{(pid=70975) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] keystone.keyfile = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] keystone.max_version = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] keystone.min_version = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] keystone.region_name = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] keystone.service_name = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] keystone.service_type = identity {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] keystone.split_loggers = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] keystone.status_code_retries = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] keystone.status_code_retry_delay = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] keystone.timeout = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] keystone.version = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.connection_uri = {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.cpu_mode = custom {{(pid=70975) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.cpu_model_extra_flags = [] {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: WARNING oslo_config.cfg [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] Deprecated: Option "cpu_model" from group "libvirt" is deprecated. Use option "cpu_models" from group "libvirt". Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.cpu_models = ['Nehalem'] {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.cpu_power_governor_high = performance {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.cpu_power_governor_low = powersave {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.cpu_power_management = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.device_detach_attempts = 8 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.device_detach_timeout = 20 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.disk_cachemodes = [] {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.disk_prefix = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.enabled_perf_events = [] {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.file_backed_memory = 0 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.gid_maps = [] 
{{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.hw_disk_discard = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.hw_machine_type = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.images_rbd_ceph_conf = {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.images_rbd_glance_store_name = {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.images_rbd_pool = rbd {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.images_type = default {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.images_volume_group = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.inject_key = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.inject_partition = -2 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.inject_password = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.iscsi_iface = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] 
libvirt.iser_use_multipath = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.live_migration_bandwidth = 0 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.live_migration_completion_timeout = 800 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.live_migration_downtime = 500 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.live_migration_downtime_delay = 75 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.live_migration_downtime_steps = 10 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.live_migration_inbound_addr = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.live_migration_permit_auto_converge = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.live_migration_permit_post_copy = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.live_migration_scheme = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.live_migration_timeout_action = abort {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.live_migration_tunnelled = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 18 16:09:52 user nova-compute[70975]: WARNING oslo_config.cfg [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Apr 18 16:09:52 user nova-compute[70975]: live_migration_uri is deprecated for removal in favor of two other options that
Apr 18 16:09:52 user nova-compute[70975]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Apr 18 16:09:52 user nova-compute[70975]: and ``live_migration_inbound_addr`` respectively.
Apr 18 16:09:52 user nova-compute[70975]: ). Its value may be silently ignored in the future.
Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.live_migration_uri = qemu+ssh://stack@%s/system {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.live_migration_with_native_tls = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.max_queues = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.mem_stats_period_seconds = 10 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.nfs_mount_options = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.nfs_mount_point_base = /opt/stack/data/nova/mnt {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.num_aoe_discover_tries = 3 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.num_iser_scan_tries = 5 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.num_memory_encrypted_guests = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.num_nvme_discover_tries = 5 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.num_pcie_ports = 0 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.num_volume_scan_tries = 5 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.pmem_namespaces = [] {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
Apr 18 16:09:52 user
nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.quobyte_client_cfg = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.quobyte_mount_point_base = /opt/stack/data/nova/mnt {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.rbd_connect_timeout = 5 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.rbd_secret_uuid = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.rbd_user = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.realtime_scheduler_priority = 1 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.remote_filesystem_transport = ssh {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.rescue_image_id = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.rescue_kernel_id = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.rescue_ramdisk_id = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.rng_dev_path = /dev/urandom {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.rx_queue_size = None {{(pid=70975) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.smbfs_mount_options = {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.smbfs_mount_point_base = /opt/stack/data/nova/mnt {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.snapshot_compression = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.snapshot_image_format = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.sparse_logical_volumes = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.swtpm_enabled = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.swtpm_group = tss {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.swtpm_user = tss {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.sysinfo_serial = unique {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.tx_queue_size = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.uid_maps = [] {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.use_virtio_for_bridges = True {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] 
libvirt.virt_type = kvm {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.volume_clear = zero {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.volume_clear_size = 0 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.volume_use_multipath = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.vzstorage_cache_path = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.vzstorage_mount_group = qemu {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.vzstorage_mount_opts = [] {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/nova/mnt {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.vzstorage_mount_user = stack {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] neutron.auth_section = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] neutron.auth_type = password {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user 
nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] neutron.cafile = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] neutron.certfile = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] neutron.collect_timing = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] neutron.connect_retries = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] neutron.connect_retry_delay = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] neutron.default_floating_pool = public {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] neutron.endpoint_override = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] neutron.extension_sync_interval = 600 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] neutron.http_retries = 3 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] neutron.insecure = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] neutron.keyfile = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] neutron.max_version = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] neutron.metadata_proxy_shared_secret = **** {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] neutron.min_version = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: 
DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] neutron.ovs_bridge = br-int {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] neutron.physnets = [] {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] neutron.region_name = RegionOne {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] neutron.service_metadata_proxy = True {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] neutron.service_name = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] neutron.service_type = network {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] neutron.split_loggers = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] neutron.status_code_retries = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] neutron.status_code_retry_delay = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] neutron.timeout = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] neutron.version = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] notifications.bdms_in_notifications = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] notifications.default_level = INFO {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user 
nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] notifications.notification_format = unversioned {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] notifications.notify_on_state_change = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] pci.alias = [] {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] pci.device_spec = [] {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] pci.report_in_placement = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] placement.auth_section = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] placement.auth_type = password {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] placement.auth_url = http://10.0.0.210/identity {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] placement.cafile = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] placement.certfile = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] placement.collect_timing = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] placement.connect_retries = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] placement.connect_retry_delay = None {{(pid=70975) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] placement.default_domain_id = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] placement.default_domain_name = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] placement.domain_id = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] placement.domain_name = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] placement.endpoint_override = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] placement.insecure = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] placement.keyfile = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] placement.max_version = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] placement.min_version = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] placement.password = **** {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] placement.project_domain_id = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] placement.project_domain_name = Default {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] placement.project_id = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] placement.project_name = service {{(pid=70975) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] placement.region_name = RegionOne {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] placement.service_name = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] placement.service_type = placement {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] placement.split_loggers = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] placement.status_code_retries = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] placement.status_code_retry_delay = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] placement.system_scope = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] placement.timeout = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] placement.trust_id = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] placement.user_domain_id = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] placement.user_domain_name = Default {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] placement.user_id = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] placement.username = placement {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] placement.valid_interfaces = ['internal', 'public'] 
{{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] placement.version = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] quota.cores = 20 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] quota.count_usage_from_placement = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] quota.injected_file_content_bytes = 10240 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] quota.injected_file_path_length = 255 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] quota.injected_files = 5 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] quota.instances = 10 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] quota.key_pairs = 100 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] quota.metadata_items = 128 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] quota.ram = 51200 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] quota.recheck_quota = True {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] quota.server_group_members = 10 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] quota.server_groups = 10 {{(pid=70975) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] rdp.enabled = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] scheduler.image_metadata_prefilter = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] scheduler.max_attempts = 3 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] scheduler.max_placement_results = 1000 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] scheduler.query_placement_for_availability_zone = True {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] scheduler.query_placement_for_image_type_support = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] scheduler.workers = 3 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 
16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] filter_scheduler.aggregate_image_properties_isolation_separator = . {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] filter_scheduler.enabled_filters = ['AvailabilityZoneFilter', 'ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] filter_scheduler.host_subset_size = 1 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] filter_scheduler.image_properties_default_architecture = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] filter_scheduler.isolated_hosts = [] {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: 
DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] filter_scheduler.isolated_images = [] {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] filter_scheduler.max_instances_per_host = 50 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] filter_scheduler.pci_in_placement = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] filter_scheduler.track_instance_changes = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] metrics.required = True {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service 
[None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] metrics.weight_multiplier = 1.0 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] metrics.weight_of_unavailable = -10000.0 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] metrics.weight_setting = [] {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] serial_console.enabled = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] serial_console.port_range = 10000:20000 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] serial_console.serialproxy_port = 6083 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] service_user.auth_section = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] service_user.auth_type = password {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] service_user.cafile = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] service_user.certfile = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] service_user.collect_timing = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 
16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] service_user.insecure = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] service_user.keyfile = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] service_user.send_service_user_token = True {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] service_user.split_loggers = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] service_user.timeout = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] spice.agent_enabled = True {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] spice.enabled = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] spice.html5proxy_base_url = http://10.0.0.210:6081/spice_auto.html {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] spice.html5proxy_host = 0.0.0.0 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] spice.html5proxy_port = 6082 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] spice.image_compression = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] spice.jpeg_compression = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] spice.playback_compression = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] spice.server_listen = 127.0.0.1 {{(pid=70975) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] spice.streaming_mode = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] spice.zlib_compression = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] upgrade_levels.baseapi = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] upgrade_levels.cert = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] upgrade_levels.compute = auto {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] upgrade_levels.conductor = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] upgrade_levels.scheduler = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vendordata_dynamic_auth.auth_section = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vendordata_dynamic_auth.auth_type = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vendordata_dynamic_auth.cafile = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vendordata_dynamic_auth.certfile = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vendordata_dynamic_auth.collect_timing = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] 
vendordata_dynamic_auth.insecure = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vendordata_dynamic_auth.keyfile = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vendordata_dynamic_auth.split_loggers = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vendordata_dynamic_auth.timeout = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vmware.api_retry_count = 10 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vmware.ca_file = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vmware.cache_prefix = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vmware.cluster_name = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vmware.connection_pool_size = 10 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vmware.console_delay_seconds = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vmware.datastore_regex = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vmware.host_ip = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vmware.host_password = **** {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vmware.host_port = 443 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] 
vmware.host_username = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vmware.insecure = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vmware.integration_bridge = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vmware.maximum_objects = 100 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vmware.pbm_default_policy = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vmware.pbm_enabled = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vmware.pbm_wsdl_location = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vmware.serial_port_proxy_uri = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vmware.serial_port_service_uri = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vmware.task_poll_interval = 0.5 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vmware.use_linked_clone = True {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vmware.vnc_keymap = en-us {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vmware.vnc_port = 5900 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] 
vmware.vnc_port_total = 10000 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vnc.auth_schemes = ['none'] {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vnc.enabled = True {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vnc.novncproxy_base_url = http://10.0.0.210:6080/vnc_lite.html {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vnc.novncproxy_port = 6080 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vnc.server_listen = 0.0.0.0 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vnc.server_proxyclient_address = 10.0.0.210 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vnc.vencrypt_ca_certs = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vnc.vencrypt_client_cert = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vnc.vencrypt_client_key = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] workarounds.disable_fallback_pcpu_query = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] workarounds.disable_group_policy_check_upcall = True {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG 
oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] workarounds.disable_rootwrap = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] workarounds.enable_numa_live_migration = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] workarounds.handle_virt_lifecycle_events = True {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] workarounds.libvirt_disable_apic = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] workarounds.never_download_image_if_on_rbd = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None 
req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] wsgi.client_socket_timeout = 900 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] wsgi.default_pool_size = 1000 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] wsgi.keep_alive = True {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] wsgi.max_header_line = 16384 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] wsgi.secure_proxy_ssl_header = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] wsgi.ssl_ca_file = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] wsgi.ssl_cert_file = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] wsgi.ssl_key_file = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] wsgi.tcp_keepidle = 600 {{(pid=70975) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] zvm.ca_file = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] zvm.cloud_connector_url = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] zvm.image_tmp_path = /opt/stack/data/nova/images {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] zvm.reachable_timeout = 300 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_policy.enforce_new_defaults = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_policy.enforce_scope = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_policy.policy_default_rule = default {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_policy.policy_file = policy.yaml {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} 
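[Annotation] The oslo_policy values logged around this point (policy_file = policy.yaml, policy_dirs = ['policy.d'], policy_default_rule = default, enforce_scope/enforce_new_defaults = False) are the knobs that oslo.policy reads when a service builds its policy enforcer. A minimal sketch of how such options are typically consumed is shown below; the config-file path and rule name are illustrative assumptions, not values taken from this log, and this is not Nova's actual enforcement code path.

# Sketch only: consuming the [oslo_policy] options from a loaded config.
# Path and rule name below are illustrative assumptions.
from oslo_config import cfg
from oslo_policy import policy

CONF = cfg.CONF
CONF(["--config-file", "/etc/nova/nova-cpu.conf"], project="nova")

# Enforcer registers the [oslo_policy] options (policy_file, policy_dirs,
# enforce_scope, enforce_new_defaults, ...) and loads rules from policy.yaml
# plus any files under policy.d.
enforcer = policy.Enforcer(CONF)
enforcer.load_rules()

creds = {"project_id": "demo", "roles": ["member"]}
target = {"project_id": "demo"}
# Returns False for a rule that is neither registered nor present in the files.
print(enforcer.enforce("os_compute_api:servers:index", target, creds))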
Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] profiler.connection_string = messaging:// {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] profiler.enabled = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] profiler.es_doc_type = notification {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] profiler.es_scroll_size = 10000 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] profiler.es_scroll_time = 2m {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] profiler.filter_error_trace = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] profiler.hmac_keys = SECRET_KEY {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] profiler.sentinel_service_name = mymaster {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] profiler.socket_timeout = 0.1 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] 
profiler.trace_sqlalchemy = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] remote_debug.host = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] remote_debug.port = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=70975) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_messaging_rabbit.rabbit_quroum_max_memory_bytes = 0 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_messaging_rabbit.rabbit_quroum_max_memory_length = 0 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 
{{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_messaging_rabbit.ssl = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_messaging_rabbit.ssl_version = {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_messaging_notifications.retry = -1 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_messaging_notifications.transport_url = **** {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_limit.auth_section = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_limit.auth_type = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user 
nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_limit.cafile = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_limit.certfile = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_limit.collect_timing = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_limit.connect_retries = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_limit.connect_retry_delay = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_limit.endpoint_id = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_limit.endpoint_override = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_limit.insecure = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_limit.keyfile = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_limit.max_version = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_limit.min_version = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_limit.region_name = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_limit.service_name = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_limit.service_type = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user 
nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_limit.split_loggers = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_limit.status_code_retries = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_limit.status_code_retry_delay = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_limit.timeout = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_limit.valid_interfaces = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_limit.version = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_reports.file_event_handler = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_reports.file_event_handler_interval = 1 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] oslo_reports.log_dir = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vif_plug_linux_bridge_privileged.group = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] 
vif_plug_linux_bridge_privileged.thread_pool_size = 12 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vif_plug_linux_bridge_privileged.user = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vif_plug_ovs_privileged.group = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vif_plug_ovs_privileged.helper_command = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vif_plug_ovs_privileged.thread_pool_size = 12 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] vif_plug_ovs_privileged.user = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] os_vif_linux_bridge.flat_interface = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:52 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:53 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:53 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:53 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:53 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=70975) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:53 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:53 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] os_vif_linux_bridge.vlan_interface = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:53 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] os_vif_ovs.isolate_vif = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:53 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:53 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:53 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:53 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] os_vif_ovs.ovsdb_interface = native {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:53 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] os_vif_ovs.per_port_bridge = False {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:53 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] os_brick.lock_path = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:53 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] privsep_osbrick.capabilities = [21] {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:53 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] privsep_osbrick.group = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:53 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] privsep_osbrick.helper_command = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:53 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:53 user nova-compute[70975]: DEBUG oslo_service.service [None 
req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] privsep_osbrick.thread_pool_size = 12 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:53 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] privsep_osbrick.user = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:53 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:53 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] nova_sys_admin.group = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:53 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] nova_sys_admin.helper_command = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:53 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:53 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] nova_sys_admin.thread_pool_size = 12 {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:53 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] nova_sys_admin.user = None {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 18 16:09:53 user nova-compute[70975]: DEBUG oslo_service.service [None req-039ba384-03de-42d5-b0c5-d215dbe8c01d None None] ******************************************************************************** {{(pid=70975) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2613}} Apr 18 16:09:53 user nova-compute[70975]: INFO nova.service [-] Starting compute node (version 0.0.0) Apr 18 16:09:53 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Starting native event thread {{(pid=70975) _init_events /opt/stack/nova/nova/virt/libvirt/host.py:492}} Apr 18 16:09:53 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Starting green dispatch thread {{(pid=70975) _init_events /opt/stack/nova/nova/virt/libvirt/host.py:498}} Apr 18 16:09:53 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Starting connection event dispatch thread {{(pid=70975) initialize /opt/stack/nova/nova/virt/libvirt/host.py:620}} Apr 18 16:09:53 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Connecting to libvirt: qemu:///system {{(pid=70975) _get_new_connection /opt/stack/nova/nova/virt/libvirt/host.py:503}} Apr 18 16:09:53 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Registering for lifecycle events {{(pid=70975) 
_get_new_connection /opt/stack/nova/nova/virt/libvirt/host.py:509}} Apr 18 16:09:53 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Registering for connection events: {{(pid=70975) _get_new_connection /opt/stack/nova/nova/virt/libvirt/host.py:530}} Apr 18 16:09:53 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Connection event '1' reason 'None' Apr 18 16:09:53 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Cannot update service status on host "user" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host user could not be found. Apr 18 16:09:53 user nova-compute[70975]: DEBUG nova.virt.libvirt.volume.mount [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Initialising _HostMountState generation 0 {{(pid=70975) host_up /opt/stack/nova/nova/virt/libvirt/volume/mount.py:130}} Apr 18 16:10:00 user nova-compute[70975]: INFO nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Libvirt host capabilities
Apr 18 16:10:00 user nova-compute[70975]: [libvirt host capabilities XML; the element tags were stripped in this capture and only text values survive. Recoverable host details: UUID e20c3142-5af9-7467-ecd8-70b2e4a210d6, arch x86_64, CPU model IvyBridge-IBRS (vendor Intel), migration URI transports tcp and rdma, memory/page figures 8189224/2047306 and 8218764/2054691, security models apparmor (DOI 0) and dac (DOI 0, baselabel +64055:+108). Guest support is advertised for os_type hvm on /usr/bin/qemu-system-alpha (clipper), -arm (two 32-bit entries), -aarch64, -cris (axis-dev88), -i386, -m68k, -microblaze, -microblazeel, -mips, -mipsel, -mips64, -mips64el, -ppc, -ppc64, -ppc64le, -riscv32, -riscv64, -s390x, -sh4, -sh4eb, -sparc, -sparc64, -x86_64, -xtensa and -xtensaeb, each with its list of supported machine types (virt-2.6 through virt-6.2 plus board machines for arm/aarch64, pc-i440fx-* and pc-q35-* up to 6.2 plus Ubuntu release-named variants for i386/x86_64, pseries-2.1 through 6.2 and powernv8/9/10 for ppc64/ppc64le, s390-ccw-virtio-* for s390x).]
Apr 18 16:10:00 user
nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Getting domain capabilities for alpha via machine types: {None} {{(pid=70975) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Error from libvirt when retrieving domain capabilities for arch alpha / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-alpha' on this host {{(pid=70975) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Getting domain capabilities for armv6l via machine types: {'virt', None} {{(pid=70975) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Error from libvirt when retrieving domain capabilities for arch armv6l / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-arm' on this host {{(pid=70975) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Error from libvirt when retrieving domain capabilities for arch armv6l / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-arm' on this host {{(pid=70975) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Getting domain capabilities for armv7l via machine types: {'virt'} {{(pid=70975) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Error from libvirt when retrieving domain capabilities for arch armv7l / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-arm' on this host {{(pid=70975) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Getting domain capabilities for aarch64 via machine types: {'virt'} {{(pid=70975) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Error from libvirt when retrieving domain capabilities for arch aarch64 / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-aarch64' on this host {{(pid=70975) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Getting domain capabilities for cris via machine types: {None} {{(pid=70975) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG 
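[Editor's note] The DEBUG entries above and below record nova.virt.libvirt.host asking libvirt for a domainCapabilities document for each guest architecture and machine type the host's QEMU binaries advertise, and logging error code 8 ("invalid argument: KVM is not supported by ...") for every emulator that cannot use KVM on this host. As a rough illustration only (this is not nova's code), the same per-arch probe could be reproduced with the libvirt Python bindings, assuming libvirt-python is installed and a local libvirtd answers at qemu:///system:

    # Hypothetical sketch: replay the domain-capabilities probes seen in the log.
    import libvirt

    # arch -> machine types to try; None lets libvirt pick a default,
    # mirroring the "{None}" sets in the DEBUG lines above.
    ARCH_MACHINES = {
        'alpha': [None],
        'aarch64': ['virt'],
        'i686': ['ubuntu-q35', 'q35', 'pc', 'ubuntu'],
        'x86_64': ['ubuntu-q35', 'q35', 'pc', 'ubuntu'],
    }

    def probe(virt_type='kvm'):
        conn = libvirt.open('qemu:///system')
        try:
            for arch, machines in ARCH_MACHINES.items():
                for machine in machines:
                    try:
                        # Returns the domainCapabilities XML that the log dumps
                        # (emulator path, canonical machine, firmware, CPU models,
                        # supported devices, ...).
                        xml = conn.getDomainCapabilities(None, arch, machine, virt_type, 0)
                        print('%s/%s: %d bytes of capabilities XML' % (arch, machine, len(xml)))
                    except libvirt.libvirtError as exc:
                        # Non-native emulators fail here the same way as in the log,
                        # e.g. code 8: KVM is not supported by '/usr/bin/qemu-system-alpha'.
                        print('%s/%s: [%d] %s' % (arch, machine,
                                                  exc.get_error_code(),
                                                  exc.get_error_message()))
        finally:
            conn.close()

    if __name__ == '__main__':
        probe()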
Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Error from libvirt when retrieving domain capabilities for arch cris / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-cris' on this host {{(pid=70975) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Getting domain capabilities for i686 via machine types: {'ubuntu-q35', 'q35', 'pc', 'ubuntu'} {{(pid=70975) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=ubuntu-q35: [domain capabilities XML elided (tags lost in capture); values include: emulator /usr/bin/qemu-system-i386, domain type kvm, machine pc-q35-jammy, arch i686; firmware /usr/share/OVMF/OVMF_CODE.fd, /usr/share/OVMF/OVMF_CODE.secboot.fd, /usr/share/OVMF/OVMF_CODE.ms.fd, /usr/share/AAVMF/AAVMF_CODE.fd, /usr/share/AAVMF/AAVMF32_CODE.fd; loader types rom, pflash; host-model CPU IvyBridge-IBRS (Intel); custom CPU models 486, qemu32/qemu64, kvm32/kvm64, athlon, phenom, pentium/pentium2/pentium3, n270, coreduo, core2duo, Conroe, Penryn, Opteron_G1 through Opteron_G5, Nehalem, Westmere, SandyBridge, IvyBridge, Haswell, Broadwell, Skylake-Client/Server, Cascadelake-Server, Cooperlake, Icelake-Client/Server, Snowridge, Dhyana, EPYC, EPYC-IBPB, EPYC-Rome, EPYC-Milan (with -IBRS and -noTSX variants); memory backing file, anonymous, memfd; disk devices disk/cdrom/floppy/lun on buses fdc, scsi, virtio, usb, sata; virtio, virtio-transitional and virtio-non-transitional models; graphics sdl, vnc, spice, egl-headless; hostdev values subsystem, default/mandatory/requisite/optional, usb/pci/scsi; rng backends random, egd, builtin; filesystem drivers path, handle, virtiofs; TPM models tpm-tis, tpm-crb with passthrough and emulator backends] {{(pid=70975) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35: [domain capabilities XML elided; same value set as the ubuntu-q35 dump above except machine pc-q35-6.2] {{(pid=70975) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc: [domain capabilities XML elided; same value set as above except machine pc-i440fx-6.2 and an additional ide disk bus] {{(pid=70975) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=ubuntu: [domain capabilities XML elided; same value set as above except machine pc-i440fx-jammy and an additional ide disk bus] {{(pid=70975) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Getting domain capabilities for m68k via machine types: {'virt', None} {{(pid=70975) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Error from libvirt when retrieving domain capabilities for arch m68k / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-m68k' on this host {{(pid=70975) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Error from libvirt when retrieving domain capabilities for arch m68k / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-m68k' on this host {{(pid=70975) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Getting domain capabilities for microblaze via machine types: {None} {{(pid=70975) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Error from libvirt when retrieving domain capabilities for arch microblaze / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-microblaze' on this host {{(pid=70975) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Getting domain capabilities for microblazeel via machine types: {None} {{(pid=70975) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Error from libvirt when retrieving domain capabilities for arch microblazeel / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-microblazeel' on this host {{(pid=70975) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Getting domain capabilities for mips via machine types: {None} {{(pid=70975) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Error from
libvirt when retrieving domain capabilities for arch mips / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-mips' on this host {{(pid=70975) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Getting domain capabilities for mipsel via machine types: {None} {{(pid=70975) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Error from libvirt when retrieving domain capabilities for arch mipsel / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-mipsel' on this host {{(pid=70975) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Getting domain capabilities for mips64 via machine types: {None} {{(pid=70975) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Error from libvirt when retrieving domain capabilities for arch mips64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-mips64' on this host {{(pid=70975) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Getting domain capabilities for mips64el via machine types: {None} {{(pid=70975) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Error from libvirt when retrieving domain capabilities for arch mips64el / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-mips64el' on this host {{(pid=70975) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Getting domain capabilities for ppc via machine types: {None} {{(pid=70975) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Error from libvirt when retrieving domain capabilities for arch ppc / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc' on this host {{(pid=70975) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Getting domain capabilities for ppc64 via machine types: {'pseries', 'powernv', None} {{(pid=70975) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Error from libvirt when retrieving 
domain capabilities for arch ppc64 / virt_type kvm / machine_type pseries: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64' on this host {{(pid=70975) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Error from libvirt when retrieving domain capabilities for arch ppc64 / virt_type kvm / machine_type powernv: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64' on this host {{(pid=70975) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Error from libvirt when retrieving domain capabilities for arch ppc64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64' on this host {{(pid=70975) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Getting domain capabilities for ppc64le via machine types: {'pseries', 'powernv'} {{(pid=70975) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Error from libvirt when retrieving domain capabilities for arch ppc64le / virt_type kvm / machine_type pseries: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64le' on this host {{(pid=70975) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Error from libvirt when retrieving domain capabilities for arch ppc64le / virt_type kvm / machine_type powernv: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64le' on this host {{(pid=70975) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Getting domain capabilities for riscv32 via machine types: {None} {{(pid=70975) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Error from libvirt when retrieving domain capabilities for arch riscv32 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-riscv32' on this host {{(pid=70975) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Getting domain capabilities for riscv64 via machine types: {None} {{(pid=70975) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Error from libvirt when retrieving domain capabilities for arch riscv64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported 
by '/usr/bin/qemu-system-riscv64' on this host {{(pid=70975) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Getting domain capabilities for s390x via machine types: {'s390-ccw-virtio'} {{(pid=70975) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Error from libvirt when retrieving domain capabilities for arch s390x / virt_type kvm / machine_type s390-ccw-virtio: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-s390x' on this host {{(pid=70975) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Getting domain capabilities for sh4 via machine types: {None} {{(pid=70975) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Error from libvirt when retrieving domain capabilities for arch sh4 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-sh4' on this host {{(pid=70975) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Getting domain capabilities for sh4eb via machine types: {None} {{(pid=70975) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Error from libvirt when retrieving domain capabilities for arch sh4eb / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-sh4eb' on this host {{(pid=70975) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Getting domain capabilities for sparc via machine types: {None} {{(pid=70975) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Error from libvirt when retrieving domain capabilities for arch sparc / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-sparc' on this host {{(pid=70975) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Getting domain capabilities for sparc64 via machine types: {None} {{(pid=70975) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Error from libvirt when retrieving domain capabilities for arch sparc64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by 
'/usr/bin/qemu-system-sparc64' on this host {{(pid=70975) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Getting domain capabilities for x86_64 via machine types: {'ubuntu-q35', 'q35', 'pc', 'ubuntu'} {{(pid=70975) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=ubuntu-q35:
Apr 18 16:10:00 user nova-compute[70975]: [domainCapabilities XML elided: emulator /usr/bin/qemu-system-x86_64, domain kvm, machine pc-q35-jammy, arch x86_64; efi loader (rom/pflash) with firmware /usr/share/OVMF/OVMF_CODE_4M.ms.fd, OVMF_CODE_4M.secboot.fd and OVMF_CODE_4M.fd; host-model CPU IvyBridge-IBRS (Intel) plus the usual custom models (486/athlon/qemu32/qemu64 through the Nehalem..Cooperlake, Opteron_G1..G5, EPYC, Dhyana and Snowridge families); memory backing file/anonymous/memfd; disk devices disk/cdrom/floppy/lun on buses fdc/scsi/virtio/usb/sata with virtio/virtio-transitional/virtio-non-transitional models; graphics sdl/vnc/spice/egl-headless; hostdev subsystem usb/pci/scsi; rng backends random/egd/builtin; filesystem drivers path/handle/virtiofs; tpm models tpm-tis/tpm-crb with passthrough/emulator backends] {{(pid=70975) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Apr 18 16:10:00 user nova-compute[70975]: [domainCapabilities XML elided: same capability set as ubuntu-q35 but canonical machine pc-q35-6.2] {{(pid=70975) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Apr 18 16:10:00 user nova-compute[70975]: [domainCapabilities XML elided: canonical machine pc-i440fx-6.2; efi firmware limited to /usr/share/OVMF/OVMF_CODE_4M.fd (no secure-boot variants); adds the ide disk bus; otherwise the same capability set as above] {{(pid=70975) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=ubuntu:
Apr 18 16:10:00 user nova-compute[70975]: [domainCapabilities XML elided: canonical machine pc-i440fx-jammy; efi firmware limited to /usr/share/OVMF/OVMF_CODE_4M.fd (no secure-boot variants); adds the ide disk bus; otherwise the same capability set as above] {{(pid=70975) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Getting domain capabilities for xtensa via machine types: {None} {{(pid=70975) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Error from libvirt when retrieving domain capabilities for arch xtensa / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-xtensa' on this host {{(pid=70975) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Getting domain capabilities for xtensaeb via machine types: {None} {{(pid=70975) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Error from libvirt when retrieving domain capabilities for arch xtensaeb / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-xtensaeb' on this host {{(pid=70975) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Checking secure boot support for host arch (x86_64) {{(pid=70975) supports_secure_boot /opt/stack/nova/nova/virt/libvirt/host.py:1750}}
Apr 18 16:10:00 user nova-compute[70975]: INFO nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Secure Boot support detected
Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] cpu compare xml: [cpu XML elided: model Nehalem] {{(pid=70975) _compare_cpu /opt/stack/nova/nova/virt/libvirt/driver.py:9996}}
Apr 18 16:10:00 user nova-compute[70975]: INFO nova.virt.node [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Generated node identity 161a05c2-8402-4a6a-9ad9-6fdf826a94d9
Apr 18 16:10:00 user nova-compute[70975]: INFO nova.virt.node [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Wrote node identity 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 to /opt/stack/data/nova/compute_id
Apr 18 16:10:00 user nova-compute[70975]: WARNING nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Compute nodes ['161a05c2-8402-4a6a-9ad9-6fdf826a94d9'] for host user were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
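The per-architecture probing recorded above is driven by libvirt's domainCapabilities API: for each (emulator, arch, machine type, virt type) combination nova asks libvirt what the binary supports, and emulators without KVM support answer with libvirt error code 8 (VIR_ERR_INVALID_ARG). Below is a minimal standalone sketch of the same query through the libvirt Python bindings; the connection URI, emulator path and machine type are chosen purely for illustration and are not taken from this host's configuration.

    # Sketch only: assumes libvirt-python is installed and libvirtd is reachable
    # at the given URI; emulator/arch/machine values are illustrative.
    import libvirt

    def dump_domain_capabilities(uri='qemu:///system',
                                 emulator='/usr/bin/qemu-system-x86_64',
                                 arch='x86_64', machine='q35', virt_type='kvm'):
        conn = libvirt.open(uri)
        try:
            # Returns the <domainCapabilities> XML for this combination.
            return conn.getDomainCapabilities(emulator, arch, machine, virt_type, 0)
        except libvirt.libvirtError as exc:
            # Emulators without KVM fail with VIR_ERR_INVALID_ARG (code 8),
            # matching the "[Error Code 8]: invalid argument" entries above.
            print('domain capabilities unavailable: %s' % exc)
            return None
        finally:
            conn.close()

    if __name__ == '__main__':
        xml = dump_domain_capabilities()
        if xml:
            print(xml)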
Apr 18 16:10:00 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host Apr 18 16:10:00 user nova-compute[70975]: WARNING nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] No compute node record found for host user. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host user could not be found. Apr 18 16:10:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Auditing locally available compute resources for user (node: user) {{(pid=70975) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 18 16:10:00 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:10:00 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
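The "compute_resources" acquire/release pairs above come from oslo.concurrency's synchronized decorator, which serialises every resource-tracker update on the host. A rough sketch of that pattern follows, assuming oslo.concurrency is installed; the class and method names are illustrative rather than nova's actual code.

    # Sketch only: oslo.concurrency's synchronized decorator; the class and the
    # tracked data are illustrative, not nova's ResourceTracker.
    from oslo_concurrency import lockutils

    COMPUTE_RESOURCE_SEMAPHORE = 'compute_resources'

    class TinyResourceTracker(object):
        def __init__(self):
            self.inventory = {}

        @lockutils.synchronized(COMPUTE_RESOURCE_SEMAPHORE)
        def update_available_resource(self, nodename, resources):
            # Only one thread at a time may mutate the tracked inventory;
            # entering and leaving this method is what produces the DEBUG
            # 'Lock "compute_resources" acquired/released' pairs above.
            self.inventory[nodename] = resources
            return self.inventory[nodename]

    tracker = TinyResourceTracker()
    tracker.update_available_resource('user', {'vcpus': 12, 'memory_mb': 16023})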
Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Hypervisor/Node resource view: name=user free_ram=10849MB free_disk=27.109046936035156GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", 
"numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70975) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:10:00 user nova-compute[70975]: WARNING nova.compute.resource_tracker [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] No compute node record for user:161a05c2-8402-4a6a-9ad9-6fdf826a94d9: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 could not be found. Apr 18 16:10:00 user nova-compute[70975]: INFO nova.compute.resource_tracker [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Compute node record created for user:user with uuid: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 18 16:10:00 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 18 16:10:01 user nova-compute[70975]: INFO nova.scheduler.client.report [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [req-d2cb8e7c-2e0a-44e3-b23c-2bcde16ca92f] Created resource provider record via placement API for resource provider with UUID 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 and name user. 
Apr 18 16:10:01 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] /sys/module/kvm_amd/parameters/sev does not exist {{(pid=70975) _kernel_supports_amd_sev /opt/stack/nova/nova/virt/libvirt/host.py:1766}} Apr 18 16:10:01 user nova-compute[70975]: INFO nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] kernel doesn't support AMD SEV Apr 18 16:10:01 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Updating inventory in ProviderTree for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 with inventory: {'MEMORY_MB': {'total': 16023, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 12, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 40, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 0}} {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 18 16:10:01 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70975) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 18 16:10:01 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Libvirt baseline CPU Apr 18 16:10:01 user nova-compute[70975]: x86_64 Apr 18 16:10:01 user nova-compute[70975]: Nehalem Apr 18 16:10:01 user nova-compute[70975]: Intel Apr 18 16:10:01 user nova-compute[70975]: Apr 18 16:10:01 user nova-compute[70975]: Apr 18 16:10:01 user nova-compute[70975]: {{(pid=70975) _get_guest_baseline_cpu_features /opt/stack/nova/nova/virt/libvirt/driver.py:12486}} Apr 18 16:10:01 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Updated inventory for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 16023, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 12, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 40, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} Apr 18 16:10:01 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Updating resource provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 generation from 0 to 1 during operation: update_inventory {{(pid=70975) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} Apr 18 16:10:01 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Updating inventory in ProviderTree for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 with inventory: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} 
{{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 18 16:10:01 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Updating resource provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 generation from 1 to 2 during operation: update_traits {{(pid=70975) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} Apr 18 16:10:01 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Compute_service record updated for user:user {{(pid=70975) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 18 16:10:01 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.646s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:10:01 user nova-compute[70975]: DEBUG nova.service [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Creating RPC server for service compute {{(pid=70975) start /opt/stack/nova/nova/service.py:182}} Apr 18 16:10:01 user nova-compute[70975]: DEBUG nova.service [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Join ServiceGroup membership for this service compute {{(pid=70975) start /opt/stack/nova/nova/service.py:199}} Apr 18 16:10:01 user nova-compute[70975]: DEBUG nova.servicegroup.drivers.db [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] DB_Driver: join new ServiceGroup member user to the compute group, service = {{(pid=70975) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} Apr 18 16:10:46 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._sync_power_states {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:10:46 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:10:52 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:10:52 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:10:52 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Starting heal instance info cache {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 18 16:10:52 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Rebuilding the list of instances to heal {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 18 16:10:52 user nova-compute[70975]: 
DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Didn't find any instances for network info cache update. {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 18 16:10:52 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:10:52 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:10:52 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:10:52 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:10:52 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:10:52 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:10:52 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=70975) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 18 16:10:52 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager.update_available_resource {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:10:52 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:10:52 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:10:52 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:10:52 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Auditing locally available compute resources for user (node: user) {{(pid=70975) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 18 16:10:52 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:10:52 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 18 16:10:52 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Hypervisor/Node resource view: name=user free_ram=10286MB free_disk=27.022953033447266GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", 
"numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70975) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 18 16:10:52 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:10:52 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:10:52 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 18 16:10:52 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 18 16:10:53 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:10:53 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:10:53 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Compute_service record updated for user:user {{(pid=70975) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 18 16:10:53 user nova-compute[70975]: DEBUG 
oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.138s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:11:53 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:11:53 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:11:53 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:11:53 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:11:53 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:11:53 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=70975) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 18 16:11:53 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager.update_available_resource {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:11:53 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:11:53 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:11:53 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:11:53 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Auditing locally available compute resources for user (node: user) {{(pid=70975) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 18 16:11:53 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:11:53 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 18 16:11:53 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Hypervisor/Node resource view: name=user free_ram=10217MB free_disk=27.070148468017578GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", 
"numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70975) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 18 16:11:53 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:11:53 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:11:53 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 18 16:11:53 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 18 16:11:53 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:11:53 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:11:53 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Compute_service record updated for user:user {{(pid=70975) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 18 16:11:53 user nova-compute[70975]: DEBUG 
oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.165s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:11:53 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:11:54 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:11:54 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Starting heal instance info cache {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 18 16:11:54 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Rebuilding the list of instances to heal {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 18 16:11:54 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Didn't find any instances for network info cache update. {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 18 16:11:54 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:11:54 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:12:53 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:12:53 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=70975) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 18 16:12:54 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:12:54 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:12:54 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Starting heal instance info cache {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 18 16:12:54 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Rebuilding the list of instances to heal {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 18 16:12:54 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Didn't find any instances for network info cache update. {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 18 16:12:54 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:12:54 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:12:54 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:12:54 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:12:54 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager.update_available_resource {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:12:54 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:12:54 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:12:54 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:12:54 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Auditing locally available compute resources for user (node: user) {{(pid=70975) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 18 16:12:54 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:12:54 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:12:54 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Hypervisor/Node resource view: name=user free_ram=10208MB free_disk=26.845001220703125GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", 
"product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", 
"dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70975) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 18 16:12:54 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:12:54 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:12:54 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 18 16:12:54 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} 
Apr 18 16:12:54 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:12:54 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:12:54 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Compute_service record updated for user:user {{(pid=70975) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 18 16:12:54 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.135s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:12:56 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:13:54 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:13:54 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:13:54 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=70975) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 18 16:13:55 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:13:55 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:13:55 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:13:56 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Acquiring lock "b9feb20a-78c0-44ac-ab87-3a68a14396aa" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:13:56 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Lock "b9feb20a-78c0-44ac-ab87-3a68a14396aa" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:13:56 user nova-compute[70975]: DEBUG nova.compute.manager [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Starting instance... {{(pid=70975) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 18 16:13:56 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:13:56 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Starting heal instance info cache {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 18 16:13:56 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Rebuilding the list of instances to heal {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 18 16:13:56 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Didn't find any instances for network info cache update. 
{{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 18 16:13:56 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:13:56 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:13:56 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager.update_available_resource {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:13:56 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:13:56 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:13:56 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:13:56 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Auditing locally available compute resources for user (node: user) {{(pid=70975) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 18 16:13:56 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:13:56 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
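The periodic-task and lock lines in this log (for example "Running periodic task ComputeManager.update_available_resource" and the "compute_resources" Acquiring/acquired/released triplets) are emitted by two oslo helpers, oslo_service.periodic_task and oslo_concurrency.lockutils, rather than by hand-written log statements. The following is a minimal, illustrative sketch of that pattern; the DemoManager class, its method names, and the 60-second spacing are assumptions for the example and are not Nova code.

    # Illustrative sketch only -- not Nova source. Shows how the
    # "Running periodic task ..." and 'Lock "..." acquired by "..."'
    # DEBUG lines seen in this log are produced by the oslo libraries.
    from oslo_concurrency import lockutils
    from oslo_config import cfg
    from oslo_service import periodic_task

    CONF = cfg.CONF


    class DemoManager(periodic_task.PeriodicTasks):
        def __init__(self):
            # PeriodicTasks collects every method decorated below.
            super().__init__(CONF)

        @periodic_task.periodic_task(spacing=60)
        def _poll_something(self, context):
            # run_periodic_tasks() logs "Running periodic task
            # DemoManager._poll_something" before invoking this method.
            self._update_resources()

        @lockutils.synchronized('compute_resources')
        def _update_resources(self):
            # lockutils logs the Acquiring / acquired / "released"
            # lines around this critical section.
            pass


    # Usage (the service framework normally drives this loop):
    # mgr = DemoManager()
    # mgr.run_periodic_tasks(context=None)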
Apr 18 16:13:56 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Hypervisor/Node resource view: name=user free_ram=9455MB free_disk=26.86688232421875GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", 
"numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70975) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 18 16:13:56 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:13:56 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:13:56 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:13:56 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance b9feb20a-78c0-44ac-ab87-3a68a14396aa has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}} Apr 18 16:13:56 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 18 16:13:56 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 18 16:13:57 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:13:57 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:13:57 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Compute_service record updated for user:user {{(pid=70975) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 18 16:13:57 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.216s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:13:57 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.183s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:13:57 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=70975) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 18 16:13:57 user nova-compute[70975]: INFO nova.compute.claims [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Claim successful on node user Apr 18 16:13:57 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:13:57 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:13:57 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.326s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:13:57 user nova-compute[70975]: DEBUG nova.compute.manager [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Start building networks asynchronously for instance. {{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 18 16:13:57 user nova-compute[70975]: DEBUG nova.compute.manager [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Allocating IP information in the background. {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 18 16:13:57 user nova-compute[70975]: DEBUG nova.network.neutron [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] allocate_for_instance() {{(pid=70975) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 18 16:13:57 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 18 16:13:57 user nova-compute[70975]: DEBUG nova.compute.manager [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Start building block device mappings for instance. {{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 18 16:13:57 user nova-compute[70975]: DEBUG nova.compute.manager [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Start spawning the instance on the hypervisor. {{(pid=70975) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 18 16:13:57 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Creating instance directory {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 18 16:13:57 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Creating image(s) Apr 18 16:13:57 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Acquiring lock "/opt/stack/data/nova/instances/b9feb20a-78c0-44ac-ab87-3a68a14396aa/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:13:57 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Lock "/opt/stack/data/nova/instances/b9feb20a-78c0-44ac-ab87-3a68a14396aa/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:13:57 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Lock "/opt/stack/data/nova/instances/b9feb20a-78c0-44ac-ab87-3a68a14396aa/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.008s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:13:57 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Acquiring lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:13:57 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None 
req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:13:58 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053.part --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:13:58 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:13:58 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053.part --force-share --output=json" returned: 0 in 0.126s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:13:58 user nova-compute[70975]: DEBUG nova.virt.images [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] b11a20de-f82a-4158-b53e-0a0c7a1552cb was qcow2, converting to raw {{(pid=70975) fetch_to_raw /opt/stack/nova/nova/virt/images.py:165}} Apr 18 16:13:58 user nova-compute[70975]: DEBUG nova.privsep.utils [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=70975) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 18 16:13:58 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053.part /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053.converted {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:13:58 user nova-compute[70975]: DEBUG nova.policy [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '045e13d387f04d8eb0709154e4114bf5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'eb907be282bb4348976527807993ee58', 
'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70975) authorize /opt/stack/nova/nova/policy.py:203}} Apr 18 16:13:58 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053.part /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053.converted" returned: 0 in 0.157s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:13:58 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053.converted --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:13:58 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053.converted --force-share --output=json" returned: 0 in 0.132s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:13:58 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.752s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:13:58 user nova-compute[70975]: INFO oslo.privsep.daemon [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova-cpu.conf', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpgrdl9q77/privsep.sock'] Apr 18 16:13:58 user sudo[79685]: stack : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/nova-rootwrap /etc/nova/rootwrap.conf privsep-helper --config-file /etc/nova/nova-cpu.conf --privsep_context nova.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpgrdl9q77/privsep.sock Apr 18 16:13:58 user sudo[79685]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1001) Apr 18 16:14:00 user sudo[79685]: pam_unix(sudo:session): session closed for user root Apr 18 16:14:00 user nova-compute[70975]: INFO oslo.privsep.daemon [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 
tempest-DeleteServersTestJSON-1528617807-project-member] Spawned new privsep daemon via rootwrap Apr 18 16:14:00 user nova-compute[70975]: INFO oslo.privsep.daemon [-] privsep daemon starting Apr 18 16:14:00 user nova-compute[70975]: INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0 Apr 18 16:14:00 user nova-compute[70975]: INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none Apr 18 16:14:00 user nova-compute[70975]: INFO oslo.privsep.daemon [-] privsep daemon running as pid 79688 Apr 18 16:14:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.165s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Acquiring lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] 
CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.154s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/b9feb20a-78c0-44ac-ab87-3a68a14396aa/disk 1073741824 {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/b9feb20a-78c0-44ac-ab87-3a68a14396aa/disk 1073741824" returned: 0 in 0.040s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.198s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.129s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:00 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Checking if we can resize image /opt/stack/data/nova/instances/b9feb20a-78c0-44ac-ab87-3a68a14396aa/disk. 
size=1073741824 {{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 18 16:14:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b9feb20a-78c0-44ac-ab87-3a68a14396aa/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b9feb20a-78c0-44ac-ab87-3a68a14396aa/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:00 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Cannot resize image /opt/stack/data/nova/instances/b9feb20a-78c0-44ac-ab87-3a68a14396aa/disk to a smaller size. {{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 18 16:14:00 user nova-compute[70975]: DEBUG nova.objects.instance [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Lazy-loading 'migration_context' on Instance uuid b9feb20a-78c0-44ac-ab87-3a68a14396aa {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:14:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Created local disks {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 18 16:14:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Ensure instance console log exists: /opt/stack/data/nova/instances/b9feb20a-78c0-44ac-ab87-3a68a14396aa/console.log {{(pid=70975) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 18 16:14:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Lock "vgpu_resources" acquired by 
"nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:00 user nova-compute[70975]: DEBUG nova.network.neutron [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Successfully created port: cba845fa-9bbc-4e86-9fc9-f9458343fcc9 {{(pid=70975) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 18 16:14:03 user nova-compute[70975]: DEBUG nova.network.neutron [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Successfully updated port: cba845fa-9bbc-4e86-9fc9-f9458343fcc9 {{(pid=70975) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 18 16:14:03 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Acquiring lock "refresh_cache-b9feb20a-78c0-44ac-ab87-3a68a14396aa" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:14:03 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Acquired lock "refresh_cache-b9feb20a-78c0-44ac-ab87-3a68a14396aa" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:14:03 user nova-compute[70975]: DEBUG nova.network.neutron [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Building network info cache for instance {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 18 16:14:03 user nova-compute[70975]: DEBUG nova.network.neutron [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Instance cache missing network info. 
{{(pid=70975) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 18 16:14:04 user nova-compute[70975]: DEBUG nova.compute.manager [req-23b18947-6df7-4dec-89ca-b4a30bb88745 req-5e7d85ca-96cc-4f15-a67a-5889b219cd6b service nova] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Received event network-changed-cba845fa-9bbc-4e86-9fc9-f9458343fcc9 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:14:04 user nova-compute[70975]: DEBUG nova.compute.manager [req-23b18947-6df7-4dec-89ca-b4a30bb88745 req-5e7d85ca-96cc-4f15-a67a-5889b219cd6b service nova] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Refreshing instance network info cache due to event network-changed-cba845fa-9bbc-4e86-9fc9-f9458343fcc9. {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 18 16:14:04 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-23b18947-6df7-4dec-89ca-b4a30bb88745 req-5e7d85ca-96cc-4f15-a67a-5889b219cd6b service nova] Acquiring lock "refresh_cache-b9feb20a-78c0-44ac-ab87-3a68a14396aa" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:14:04 user nova-compute[70975]: DEBUG nova.network.neutron [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Updating instance_info_cache with network_info: [{"id": "cba845fa-9bbc-4e86-9fc9-f9458343fcc9", "address": "fa:16:3e:db:e1:db", "network": {"id": "16a8b366-68dd-415f-bae3-c01a7603f384", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1737580312-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "eb907be282bb4348976527807993ee58", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcba845fa-9b", "ovs_interfaceid": "cba845fa-9bbc-4e86-9fc9-f9458343fcc9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:14:04 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Releasing lock "refresh_cache-b9feb20a-78c0-44ac-ab87-3a68a14396aa" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:14:04 user nova-compute[70975]: DEBUG nova.compute.manager [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Instance network_info: |[{"id": "cba845fa-9bbc-4e86-9fc9-f9458343fcc9", "address": "fa:16:3e:db:e1:db", "network": {"id": "16a8b366-68dd-415f-bae3-c01a7603f384", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1737580312-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": 
{"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "eb907be282bb4348976527807993ee58", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcba845fa-9b", "ovs_interfaceid": "cba845fa-9bbc-4e86-9fc9-f9458343fcc9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 18 16:14:04 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-23b18947-6df7-4dec-89ca-b4a30bb88745 req-5e7d85ca-96cc-4f15-a67a-5889b219cd6b service nova] Acquired lock "refresh_cache-b9feb20a-78c0-44ac-ab87-3a68a14396aa" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:14:04 user nova-compute[70975]: DEBUG nova.network.neutron [req-23b18947-6df7-4dec-89ca-b4a30bb88745 req-5e7d85ca-96cc-4f15-a67a-5889b219cd6b service nova] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Refreshing network info cache for port cba845fa-9bbc-4e86-9fc9-f9458343fcc9 {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 18 16:14:04 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Start _get_guest_xml network_info=[{"id": "cba845fa-9bbc-4e86-9fc9-f9458343fcc9", "address": "fa:16:3e:db:e1:db", "network": {"id": "16a8b366-68dd-415f-bae3-c01a7603f384", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1737580312-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "eb907be282bb4348976527807993ee58", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcba845fa-9b", "ovs_interfaceid": "cba845fa-9bbc-4e86-9fc9-f9458343fcc9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 
'size': 0, 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encrypted': False, 'device_type': 'disk', 'encryption_secret_uuid': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'b11a20de-f82a-4158-b53e-0a0c7a1552cb'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 18 16:14:04 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:14:04 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:14:04 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70975) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 18 16:14:04 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-18T16:11:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=), allow threads: True {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 18 16:14:04 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Flavor limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 18 16:14:04 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Image limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 18 16:14:04 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Flavor pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 18 16:14:04 user nova-compute[70975]: DEBUG nova.virt.hardware [None 
req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Image pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 18 16:14:04 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 18 16:14:04 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 18 16:14:04 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 18 16:14:04 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Got 1 possible topologies {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 18 16:14:04 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 18 16:14:04 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 18 16:14:04 user nova-compute[70975]: DEBUG nova.privsep.utils [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=70975) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 18 16:14:04 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:13:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-421490992',display_name='tempest-DeleteServersTestJSON-server-421490992',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-deleteserverstestjson-server-421490992',id=1,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='eb907be282bb4348976527807993ee58',ramdisk_id='',reservation_id='r-7jxbdvnp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1528617807',owner_user_name='tempest-DeleteServersTestJSON-1528617807-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:13:58Z,user_data=None,user_id='045e13d387f04d8eb0709154e4114bf5',uuid=b9feb20a-78c0-44ac-ab87-3a68a14396aa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cba845fa-9bbc-4e86-9fc9-f9458343fcc9", "address": "fa:16:3e:db:e1:db", "network": {"id": "16a8b366-68dd-415f-bae3-c01a7603f384", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1737580312-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "eb907be282bb4348976527807993ee58", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcba845fa-9b", "ovs_interfaceid": "cba845fa-9bbc-4e86-9fc9-f9458343fcc9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70975) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 18 16:14:04 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Converting VIF {"id": "cba845fa-9bbc-4e86-9fc9-f9458343fcc9", "address": "fa:16:3e:db:e1:db", "network": {"id": 
"16a8b366-68dd-415f-bae3-c01a7603f384", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1737580312-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "eb907be282bb4348976527807993ee58", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcba845fa-9b", "ovs_interfaceid": "cba845fa-9bbc-4e86-9fc9-f9458343fcc9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:14:04 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:db:e1:db,bridge_name='br-int',has_traffic_filtering=True,id=cba845fa-9bbc-4e86-9fc9-f9458343fcc9,network=Network(16a8b366-68dd-415f-bae3-c01a7603f384),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcba845fa-9b') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:14:04 user nova-compute[70975]: DEBUG nova.objects.instance [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Lazy-loading 'pci_devices' on Instance uuid b9feb20a-78c0-44ac-ab87-3a68a14396aa {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:14:04 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] End _get_guest_xml xml= Apr 18 16:14:04 user nova-compute[70975]: b9feb20a-78c0-44ac-ab87-3a68a14396aa Apr 18 16:14:04 user nova-compute[70975]: instance-00000001 Apr 18 16:14:04 user nova-compute[70975]: 131072 Apr 18 16:14:04 user nova-compute[70975]: 1 Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: tempest-DeleteServersTestJSON-server-421490992 Apr 18 16:14:04 user nova-compute[70975]: 2023-04-18 16:14:04 Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: 128 Apr 18 16:14:04 user nova-compute[70975]: 1 Apr 18 16:14:04 user nova-compute[70975]: 0 Apr 18 16:14:04 user nova-compute[70975]: 0 Apr 18 16:14:04 user nova-compute[70975]: 1 Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: tempest-DeleteServersTestJSON-1528617807-project-member Apr 18 16:14:04 user nova-compute[70975]: tempest-DeleteServersTestJSON-1528617807 Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: 
Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: OpenStack Foundation Apr 18 16:14:04 user nova-compute[70975]: OpenStack Nova Apr 18 16:14:04 user nova-compute[70975]: 0.0.0 Apr 18 16:14:04 user nova-compute[70975]: b9feb20a-78c0-44ac-ab87-3a68a14396aa Apr 18 16:14:04 user nova-compute[70975]: b9feb20a-78c0-44ac-ab87-3a68a14396aa Apr 18 16:14:04 user nova-compute[70975]: Virtual Machine Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: hvm Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Nehalem Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: /dev/urandom Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: Apr 18 16:14:04 user nova-compute[70975]: {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 18 16:14:04 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:13:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-421490992',display_name='tempest-DeleteServersTestJSON-server-421490992',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-deleteserverstestjson-server-421490992',id=1,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='eb907be282bb4348976527807993ee58',ramdisk_id='',reservation_id='r-7jxbdvnp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1528617807',owner_user_name='tempest-DeleteServersTestJSON-1528617807-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:13:58Z,user_data=None,user_id='045e13d387f04d8eb0709154e4114bf5',uuid=b9feb20a-78c0-44ac-ab87-3a68a14396aa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cba845fa-9bbc-4e86-9fc9-f9458343fcc9", "address": "fa:16:3e:db:e1:db", "network": {"id": "16a8b366-68dd-415f-bae3-c01a7603f384", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1737580312-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "eb907be282bb4348976527807993ee58", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcba845fa-9b", "ovs_interfaceid": "cba845fa-9bbc-4e86-9fc9-f9458343fcc9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 18 16:14:04 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Converting VIF {"id": "cba845fa-9bbc-4e86-9fc9-f9458343fcc9", "address": "fa:16:3e:db:e1:db", "network": {"id": 
"16a8b366-68dd-415f-bae3-c01a7603f384", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1737580312-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "eb907be282bb4348976527807993ee58", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcba845fa-9b", "ovs_interfaceid": "cba845fa-9bbc-4e86-9fc9-f9458343fcc9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:14:04 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:db:e1:db,bridge_name='br-int',has_traffic_filtering=True,id=cba845fa-9bbc-4e86-9fc9-f9458343fcc9,network=Network(16a8b366-68dd-415f-bae3-c01a7603f384),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcba845fa-9b') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:14:04 user nova-compute[70975]: DEBUG os_vif [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:e1:db,bridge_name='br-int',has_traffic_filtering=True,id=cba845fa-9bbc-4e86-9fc9-f9458343fcc9,network=Network(16a8b366-68dd-415f-bae3-c01a7603f384),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcba845fa-9b') {{(pid=70975) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 18 16:14:04 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Created schema index Interface.name {{(pid=70975) autocreate_indices /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/__init__.py:106}} Apr 18 16:14:04 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Created schema index Port.name {{(pid=70975) autocreate_indices /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/__init__.py:106}} Apr 18 16:14:04 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Created schema index Bridge.name {{(pid=70975) autocreate_indices /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/__init__.py:106}} Apr 18 16:14:04 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] tcp:127.0.0.1:6640: entering 
CONNECTING {{(pid=70975) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 18 16:14:04 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] [POLLOUT] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:04 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70975) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 18 16:14:04 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:04 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:04 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:14:04 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 18 16:14:04 user nova-compute[70975]: INFO oslo.privsep.daemon [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova-cpu.conf', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpk488znpl/privsep.sock'] Apr 18 16:14:05 user sudo[79710]: stack : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/nova-rootwrap /etc/nova/rootwrap.conf privsep-helper --config-file /etc/nova/nova-cpu.conf --privsep_context vif_plug_ovs.privsep.vif_plug --privsep_sock_path /tmp/tmpk488znpl/privsep.sock Apr 18 16:14:05 user sudo[79710]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1001) Apr 18 16:14:06 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:06 user sudo[79710]: pam_unix(sudo:session): session closed for user root Apr 18 16:14:07 user nova-compute[70975]: DEBUG nova.network.neutron [req-23b18947-6df7-4dec-89ca-b4a30bb88745 req-5e7d85ca-96cc-4f15-a67a-5889b219cd6b service nova] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Updated VIF entry in instance network info cache for port cba845fa-9bbc-4e86-9fc9-f9458343fcc9. 
{{(pid=70975) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 18 16:14:07 user nova-compute[70975]: DEBUG nova.network.neutron [req-23b18947-6df7-4dec-89ca-b4a30bb88745 req-5e7d85ca-96cc-4f15-a67a-5889b219cd6b service nova] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Updating instance_info_cache with network_info: [{"id": "cba845fa-9bbc-4e86-9fc9-f9458343fcc9", "address": "fa:16:3e:db:e1:db", "network": {"id": "16a8b366-68dd-415f-bae3-c01a7603f384", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1737580312-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "eb907be282bb4348976527807993ee58", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcba845fa-9b", "ovs_interfaceid": "cba845fa-9bbc-4e86-9fc9-f9458343fcc9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:14:07 user nova-compute[70975]: INFO oslo.privsep.daemon [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Spawned new privsep daemon via rootwrap Apr 18 16:14:07 user nova-compute[70975]: INFO oslo.privsep.daemon [-] privsep daemon starting Apr 18 16:14:07 user nova-compute[70975]: INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0 Apr 18 16:14:07 user nova-compute[70975]: INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none Apr 18 16:14:07 user nova-compute[70975]: INFO oslo.privsep.daemon [-] privsep daemon running as pid 79715 Apr 18 16:14:07 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-23b18947-6df7-4dec-89ca-b4a30bb88745 req-5e7d85ca-96cc-4f15-a67a-5889b219cd6b service nova] Releasing lock "refresh_cache-b9feb20a-78c0-44ac-ab87-3a68a14396aa" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:14:07 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:07 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:07 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:07 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcba845fa-9b, may_exist=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:14:07 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): 
DbSetCommand(_result=None, table=Interface, record=tapcba845fa-9b, col_values=(('external_ids', {'iface-id': 'cba845fa-9bbc-4e86-9fc9-f9458343fcc9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:db:e1:db', 'vm-uuid': 'b9feb20a-78c0-44ac-ab87-3a68a14396aa'}),)) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:14:07 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:14:07 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:07 user nova-compute[70975]: INFO os_vif [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:e1:db,bridge_name='br-int',has_traffic_filtering=True,id=cba845fa-9bbc-4e86-9fc9-f9458343fcc9,network=Network(16a8b366-68dd-415f-bae3-c01a7603f384),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcba845fa-9b') Apr 18 16:14:07 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] No BDM found with device name vda, not building metadata. {{(pid=70975) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 18 16:14:07 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] No VIF found with MAC fa:16:3e:db:e1:db, not building metadata {{(pid=70975) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 18 16:14:08 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Acquiring lock "da82d905-1ca1-403d-9598-7561e69b9704" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:08 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lock "da82d905-1ca1-403d-9598-7561e69b9704" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:08 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Acquiring lock "1b530349-680e-4def-86ef-29c340efa175" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:08 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None 
req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Lock "1b530349-680e-4def-86ef-29c340efa175" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:08 user nova-compute[70975]: DEBUG nova.compute.manager [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Starting instance... {{(pid=70975) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 18 16:14:08 user nova-compute[70975]: DEBUG nova.compute.manager [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] [instance: 1b530349-680e-4def-86ef-29c340efa175] Starting instance... {{(pid=70975) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 18 16:14:08 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:08 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:08 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=70975) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 18 16:14:08 user nova-compute[70975]: INFO nova.compute.claims [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Claim successful on node user Apr 18 16:14:08 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:08 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:14:08 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:14:08 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.313s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:08 user nova-compute[70975]: DEBUG nova.compute.manager [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Start building networks asynchronously for instance. 
{{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 18 16:14:08 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.290s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:08 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=70975) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 18 16:14:08 user nova-compute[70975]: INFO nova.compute.claims [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] [instance: 1b530349-680e-4def-86ef-29c340efa175] Claim successful on node user Apr 18 16:14:08 user nova-compute[70975]: DEBUG nova.compute.manager [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Allocating IP information in the background. {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 18 16:14:08 user nova-compute[70975]: DEBUG nova.network.neutron [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: da82d905-1ca1-403d-9598-7561e69b9704] allocate_for_instance() {{(pid=70975) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 18 16:14:08 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 18 16:14:08 user nova-compute[70975]: DEBUG nova.compute.manager [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Start building block device mappings for instance. {{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 18 16:14:08 user nova-compute[70975]: DEBUG nova.compute.manager [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Start spawning the instance on the hypervisor. 
{{(pid=70975) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 18 16:14:08 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Creating instance directory {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 18 16:14:08 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Creating image(s) Apr 18 16:14:08 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Acquiring lock "/opt/stack/data/nova/instances/da82d905-1ca1-403d-9598-7561e69b9704/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:08 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lock "/opt/stack/data/nova/instances/da82d905-1ca1-403d-9598-7561e69b9704/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:08 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lock "/opt/stack/data/nova/instances/da82d905-1ca1-403d-9598-7561e69b9704/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:08 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:08 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:14:08 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Inventory 
has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:14:08 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.352s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:08 user nova-compute[70975]: DEBUG nova.compute.manager [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] [instance: 1b530349-680e-4def-86ef-29c340efa175] Start building networks asynchronously for instance. {{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.181s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Acquiring lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] 
on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG nova.compute.manager [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] [instance: 1b530349-680e-4def-86ef-29c340efa175] Allocating IP information in the background. {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG nova.network.neutron [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] [instance: 1b530349-680e-4def-86ef-29c340efa175] allocate_for_instance() {{(pid=70975) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:09 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] [instance: 1b530349-680e-4def-86ef-29c340efa175] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 18 16:14:09 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG nova.compute.manager [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] [instance: 1b530349-680e-4def-86ef-29c340efa175] Start building block device mappings for instance. 
{{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.189s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/da82d905-1ca1-403d-9598-7561e69b9704/disk 1073741824 {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG nova.policy [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a8a3f45f9c6c431781fb582b8da22b0b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '261e8ba82d9e4203917afb0241a3b4fc', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70975) authorize /opt/stack/nova/nova/policy.py:203}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/da82d905-1ca1-403d-9598-7561e69b9704/disk 1073741824" returned: 0 in 0.086s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.282s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG nova.compute.manager [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] [instance: 1b530349-680e-4def-86ef-29c340efa175] Start spawning the instance on the hypervisor. {{(pid=70975) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] [instance: 1b530349-680e-4def-86ef-29c340efa175] Creating instance directory {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 18 16:14:09 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] [instance: 1b530349-680e-4def-86ef-29c340efa175] Creating image(s) Apr 18 16:14:09 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Acquiring lock "/opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Lock "/opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Lock "/opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 
tempest-ServerRescueNegativeTestJSON-1586888284-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.150s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Checking if we can resize image /opt/stack/data/nova/instances/da82d905-1ca1-403d-9598-7561e69b9704/disk. size=1073741824 {{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/da82d905-1ca1-403d-9598-7561e69b9704/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.166s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Acquiring lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None 
req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/da82d905-1ca1-403d-9598-7561e69b9704/disk --force-share --output=json" returned: 0 in 0.157s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Cannot resize image /opt/stack/data/nova/instances/da82d905-1ca1-403d-9598-7561e69b9704/disk to a smaller size. {{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG nova.objects.instance [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lazy-loading 'migration_context' on Instance uuid da82d905-1ca1-403d-9598-7561e69b9704 {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Created local disks {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Ensure instance console log exists: /opt/stack/data/nova/instances/da82d905-1ca1-403d-9598-7561e69b9704/console.log {{(pid=70975) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None 
req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.143s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/disk 1073741824 {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG nova.policy [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '07b7b9d8fdcf42f29e83e755f4f27380', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'caa61b19cc4e4cd4bb7d41291c40ef1f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70975) authorize /opt/stack/nova/nova/policy.py:203}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/disk 1073741824" returned: 0 in 0.046s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.191s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None 
req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.161s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Checking if we can resize image /opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/disk. size=1073741824 {{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 18 16:14:09 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:10 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/disk --force-share --output=json" returned: 0 in 0.147s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:10 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Cannot resize image /opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/disk to a smaller size. 
{{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 18 16:14:10 user nova-compute[70975]: DEBUG nova.objects.instance [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Lazy-loading 'migration_context' on Instance uuid 1b530349-680e-4def-86ef-29c340efa175 {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:14:10 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] [instance: 1b530349-680e-4def-86ef-29c340efa175] Created local disks {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 18 16:14:10 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] [instance: 1b530349-680e-4def-86ef-29c340efa175] Ensure instance console log exists: /opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/console.log {{(pid=70975) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 18 16:14:10 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:10 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:10 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:10 user nova-compute[70975]: DEBUG nova.compute.manager [req-4646cec8-e037-4dcd-9fd5-d6170e7f8451 req-1395282e-1aa7-4aa2-941f-671fe6bc230c service nova] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Received event network-vif-plugged-cba845fa-9bbc-4e86-9fc9-f9458343fcc9 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:14:10 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-4646cec8-e037-4dcd-9fd5-d6170e7f8451 req-1395282e-1aa7-4aa2-941f-671fe6bc230c service nova] Acquiring lock "b9feb20a-78c0-44ac-ab87-3a68a14396aa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:10 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-4646cec8-e037-4dcd-9fd5-d6170e7f8451 req-1395282e-1aa7-4aa2-941f-671fe6bc230c service nova] Lock "b9feb20a-78c0-44ac-ab87-3a68a14396aa-events" 
acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:10 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-4646cec8-e037-4dcd-9fd5-d6170e7f8451 req-1395282e-1aa7-4aa2-941f-671fe6bc230c service nova] Lock "b9feb20a-78c0-44ac-ab87-3a68a14396aa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:10 user nova-compute[70975]: DEBUG nova.compute.manager [req-4646cec8-e037-4dcd-9fd5-d6170e7f8451 req-1395282e-1aa7-4aa2-941f-671fe6bc230c service nova] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] No waiting events found dispatching network-vif-plugged-cba845fa-9bbc-4e86-9fc9-f9458343fcc9 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:14:10 user nova-compute[70975]: WARNING nova.compute.manager [req-4646cec8-e037-4dcd-9fd5-d6170e7f8451 req-1395282e-1aa7-4aa2-941f-671fe6bc230c service nova] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Received unexpected event network-vif-plugged-cba845fa-9bbc-4e86-9fc9-f9458343fcc9 for instance with vm_state building and task_state spawning. Apr 18 16:14:10 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Acquiring lock "d7a293bf-a9bd-424e-ba11-bbed7dfea41c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:10 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lock "d7a293bf-a9bd-424e-ba11-bbed7dfea41c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:10 user nova-compute[70975]: DEBUG nova.compute.manager [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Starting instance... 
{{(pid=70975) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 18 16:14:10 user nova-compute[70975]: DEBUG nova.compute.manager [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Instance event wait completed in 0 seconds for {{(pid=70975) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 18 16:14:10 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Guest created on hypervisor {{(pid=70975) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 18 16:14:10 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Resumed> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:14:10 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] VM Resumed (Lifecycle Event) Apr 18 16:14:10 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Instance spawned successfully. Apr 18 16:14:10 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 18 16:14:10 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:14:10 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Found default for hw_cdrom_bus of ide {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:14:10 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Found default for hw_disk_bus of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:14:10 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Found default for hw_input_bus of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:14:10 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None 
req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Found default for hw_pointer_model of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:14:10 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Found default for hw_video_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:14:10 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Found default for hw_vif_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:14:10 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:14:10 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:10 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:10 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=70975) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 18 16:14:10 user nova-compute[70975]: INFO nova.compute.claims [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Claim successful on node user Apr 18 16:14:10 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] During sync_power_state the instance has a pending task (spawning). Skip. 
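The "Acquiring lock" / "Lock ... acquired" / "Lock ... released" triplets above (compute_resources, vgpu_resources, the per-instance event locks) come from oslo.concurrency's lockutils wrapper, whose inner function at lockutils.py:404/409/423 is what the source-location tags point at. A minimal sketch of producing the same DEBUG pattern with the public decorator, assuming only stock oslo.concurrency and standard logging (this is illustrative, not Nova's actual resource-tracker code):

import logging

from oslo_concurrency import lockutils

logging.basicConfig(level=logging.DEBUG)  # surface the lockutils DEBUG lines


@lockutils.synchronized("compute_resources")
def instance_claim():
    # Runs while the in-process "compute_resources" lock is held; lockutils
    # logs the acquire wait time and the hold time around this body, which is
    # what produces the "waited 0.001s" / "held 0.385s" figures in the log.
    pass


instance_claim()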
Apr 18 16:14:10 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Started> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:14:10 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] VM Started (Lifecycle Event) Apr 18 16:14:10 user nova-compute[70975]: DEBUG nova.network.neutron [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Successfully created port: 894e80db-f051-4b32-adc8-e3afa321eb34 {{(pid=70975) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 18 16:14:10 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:14:10 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:14:10 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] During sync_power_state the instance has a pending task (spawning). Skip. Apr 18 16:14:10 user nova-compute[70975]: INFO nova.compute.manager [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Took 13.18 seconds to spawn the instance on the hypervisor. Apr 18 16:14:10 user nova-compute[70975]: DEBUG nova.compute.manager [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:14:11 user nova-compute[70975]: INFO nova.compute.manager [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Took 14.22 seconds to build instance. 
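Every qemu-img invocation in this trace is wrapped in oslo_concurrency.prlimit with --as=1073741824 --cpu=30, i.e. a 1 GiB address-space cap and a 30 second CPU-time cap on the image probe. A sketch of issuing the same probe from Python with processutils; the disk path is the one from the log and the limits simply mirror the logged flags:

from oslo_concurrency import processutils

DISK = "/opt/stack/data/nova/instances/da82d905-1ca1-403d-9598-7561e69b9704/disk"

# Mirrors the logged prlimit flags: --as=1073741824 (1 GiB) and --cpu=30.
limits = processutils.ProcessLimits(address_space=1073741824, cpu_time=30)

# processutils re-executes the command under
# "python -m oslo_concurrency.prlimit --as=... --cpu=30 --",
# which is exactly the CMD form recorded above.
out, _err = processutils.execute(
    "env", "LC_ALL=C", "LANG=C",
    "qemu-img", "info", DISK, "--force-share", "--output=json",
    prlimit=limits,
)
print(out)  # JSON description of the qcow2 overlay and its raw backing file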
Apr 18 16:14:11 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:14:11 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-434b4276-b086-4dba-b8d2-6dbb1879a882 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Lock "b9feb20a-78c0-44ac-ab87-3a68a14396aa" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 14.733s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:11 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:14:11 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.385s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:11 user nova-compute[70975]: DEBUG nova.compute.manager [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Start building networks asynchronously for instance. {{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 18 16:14:11 user nova-compute[70975]: DEBUG nova.compute.manager [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Allocating IP information in the background. 
{{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 18 16:14:11 user nova-compute[70975]: DEBUG nova.network.neutron [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] allocate_for_instance() {{(pid=70975) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 18 16:14:11 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:11 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 18 16:14:11 user nova-compute[70975]: DEBUG nova.compute.manager [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Start building block device mappings for instance. {{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 18 16:14:11 user nova-compute[70975]: DEBUG nova.compute.manager [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Start spawning the instance on the hypervisor. {{(pid=70975) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 18 16:14:11 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Creating instance directory {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 18 16:14:11 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Creating image(s) Apr 18 16:14:11 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Acquiring lock "/opt/stack/data/nova/instances/d7a293bf-a9bd-424e-ba11-bbed7dfea41c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:11 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lock "/opt/stack/data/nova/instances/d7a293bf-a9bd-424e-ba11-bbed7dfea41c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=70975) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:11 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lock "/opt/stack/data/nova/instances/d7a293bf-a9bd-424e-ba11-bbed7dfea41c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:11 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:11 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.168s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:11 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Acquiring lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:11 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:11 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:11 user nova-compute[70975]: DEBUG nova.policy [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Policy check for 
network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a8a3f45f9c6c431781fb582b8da22b0b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '261e8ba82d9e4203917afb0241a3b4fc', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70975) authorize /opt/stack/nova/nova/policy.py:203}} Apr 18 16:14:11 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.206s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:11 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/d7a293bf-a9bd-424e-ba11-bbed7dfea41c/disk 1073741824 {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:11 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/d7a293bf-a9bd-424e-ba11-bbed7dfea41c/disk 1073741824" returned: 0 in 0.060s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:11 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.279s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:11 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:12 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-11ef488d-46c3-48cb-b234-0aa757927a21 
tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.183s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:12 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Checking if we can resize image /opt/stack/data/nova/instances/d7a293bf-a9bd-424e-ba11-bbed7dfea41c/disk. size=1073741824 {{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 18 16:14:12 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d7a293bf-a9bd-424e-ba11-bbed7dfea41c/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:12 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:12 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d7a293bf-a9bd-424e-ba11-bbed7dfea41c/disk --force-share --output=json" returned: 0 in 0.191s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:12 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Cannot resize image /opt/stack/data/nova/instances/d7a293bf-a9bd-424e-ba11-bbed7dfea41c/disk to a smaller size. 
{{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 18 16:14:12 user nova-compute[70975]: DEBUG nova.objects.instance [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lazy-loading 'migration_context' on Instance uuid d7a293bf-a9bd-424e-ba11-bbed7dfea41c {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:14:12 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Created local disks {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 18 16:14:12 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Ensure instance console log exists: /opt/stack/data/nova/instances/d7a293bf-a9bd-424e-ba11-bbed7dfea41c/console.log {{(pid=70975) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 18 16:14:12 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:12 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:12 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:12 user nova-compute[70975]: DEBUG nova.compute.manager [req-3ad3e915-d973-45c7-b4e3-bf4dcb6fdbe1 req-63a936c5-2fd4-488b-a92f-54c8a881fa87 service nova] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Received event network-vif-plugged-cba845fa-9bbc-4e86-9fc9-f9458343fcc9 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:14:12 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-3ad3e915-d973-45c7-b4e3-bf4dcb6fdbe1 req-63a936c5-2fd4-488b-a92f-54c8a881fa87 service nova] Acquiring lock "b9feb20a-78c0-44ac-ab87-3a68a14396aa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:12 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-3ad3e915-d973-45c7-b4e3-bf4dcb6fdbe1 
req-63a936c5-2fd4-488b-a92f-54c8a881fa87 service nova] Lock "b9feb20a-78c0-44ac-ab87-3a68a14396aa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:12 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-3ad3e915-d973-45c7-b4e3-bf4dcb6fdbe1 req-63a936c5-2fd4-488b-a92f-54c8a881fa87 service nova] Lock "b9feb20a-78c0-44ac-ab87-3a68a14396aa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:12 user nova-compute[70975]: DEBUG nova.compute.manager [req-3ad3e915-d973-45c7-b4e3-bf4dcb6fdbe1 req-63a936c5-2fd4-488b-a92f-54c8a881fa87 service nova] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] No waiting events found dispatching network-vif-plugged-cba845fa-9bbc-4e86-9fc9-f9458343fcc9 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:14:12 user nova-compute[70975]: WARNING nova.compute.manager [req-3ad3e915-d973-45c7-b4e3-bf4dcb6fdbe1 req-63a936c5-2fd4-488b-a92f-54c8a881fa87 service nova] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Received unexpected event network-vif-plugged-cba845fa-9bbc-4e86-9fc9-f9458343fcc9 for instance with vm_state active and task_state None. Apr 18 16:14:12 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:13 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:13 user nova-compute[70975]: DEBUG nova.network.neutron [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Successfully updated port: 894e80db-f051-4b32-adc8-e3afa321eb34 {{(pid=70975) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 18 16:14:13 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Acquiring lock "refresh_cache-da82d905-1ca1-403d-9598-7561e69b9704" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:14:13 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Acquired lock "refresh_cache-da82d905-1ca1-403d-9598-7561e69b9704" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:14:13 user nova-compute[70975]: DEBUG nova.network.neutron [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Building network info cache for instance {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 18 16:14:13 user nova-compute[70975]: DEBUG nova.compute.manager [req-b77b59d6-75ba-416a-90be-3973050f1e02 
req-fae0d70d-e8d7-4f03-b5d3-75a755235cf4 service nova] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Received event network-changed-894e80db-f051-4b32-adc8-e3afa321eb34 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:14:13 user nova-compute[70975]: DEBUG nova.compute.manager [req-b77b59d6-75ba-416a-90be-3973050f1e02 req-fae0d70d-e8d7-4f03-b5d3-75a755235cf4 service nova] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Refreshing instance network info cache due to event network-changed-894e80db-f051-4b32-adc8-e3afa321eb34. {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 18 16:14:13 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-b77b59d6-75ba-416a-90be-3973050f1e02 req-fae0d70d-e8d7-4f03-b5d3-75a755235cf4 service nova] Acquiring lock "refresh_cache-da82d905-1ca1-403d-9598-7561e69b9704" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:14:13 user nova-compute[70975]: DEBUG nova.network.neutron [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Instance cache missing network info. {{(pid=70975) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 18 16:14:13 user nova-compute[70975]: DEBUG nova.network.neutron [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] [instance: 1b530349-680e-4def-86ef-29c340efa175] Successfully created port: 64d26c20-add4-4a63-bace-6a3678032692 {{(pid=70975) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 18 16:14:13 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:14 user nova-compute[70975]: DEBUG nova.network.neutron [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Successfully created port: e5d69d5c-1a5c-4300-ab15-e73f78388f0e {{(pid=70975) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 18 16:14:14 user nova-compute[70975]: DEBUG nova.network.neutron [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Updating instance_info_cache with network_info: [{"id": "894e80db-f051-4b32-adc8-e3afa321eb34", "address": "fa:16:3e:ad:ba:71", "network": {"id": "1bff5cf0-81ae-4bf0-88ab-fd844ee78e3f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1814061150-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "261e8ba82d9e4203917afb0241a3b4fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap894e80db-f0", "ovs_interfaceid": 
"894e80db-f051-4b32-adc8-e3afa321eb34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:14:14 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Releasing lock "refresh_cache-da82d905-1ca1-403d-9598-7561e69b9704" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:14:14 user nova-compute[70975]: DEBUG nova.compute.manager [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Instance network_info: |[{"id": "894e80db-f051-4b32-adc8-e3afa321eb34", "address": "fa:16:3e:ad:ba:71", "network": {"id": "1bff5cf0-81ae-4bf0-88ab-fd844ee78e3f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1814061150-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "261e8ba82d9e4203917afb0241a3b4fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap894e80db-f0", "ovs_interfaceid": "894e80db-f051-4b32-adc8-e3afa321eb34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 18 16:14:14 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-b77b59d6-75ba-416a-90be-3973050f1e02 req-fae0d70d-e8d7-4f03-b5d3-75a755235cf4 service nova] Acquired lock "refresh_cache-da82d905-1ca1-403d-9598-7561e69b9704" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:14:14 user nova-compute[70975]: DEBUG nova.network.neutron [req-b77b59d6-75ba-416a-90be-3973050f1e02 req-fae0d70d-e8d7-4f03-b5d3-75a755235cf4 service nova] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Refreshing network info cache for port 894e80db-f051-4b32-adc8-e3afa321eb34 {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 18 16:14:14 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Start _get_guest_xml network_info=[{"id": "894e80db-f051-4b32-adc8-e3afa321eb34", "address": "fa:16:3e:ad:ba:71", "network": {"id": "1bff5cf0-81ae-4bf0-88ab-fd844ee78e3f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1814061150-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "261e8ba82d9e4203917afb0241a3b4fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap894e80db-f0", "ovs_interfaceid": "894e80db-f051-4b32-adc8-e3afa321eb34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encrypted': False, 'device_type': 'disk', 'encryption_secret_uuid': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'b11a20de-f82a-4158-b53e-0a0c7a1552cb'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 18 16:14:14 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:14:14 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 18 16:14:14 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70975) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 18 16:14:14 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-18T16:11:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=), allow threads: True {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 18 16:14:14 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Flavor limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 18 16:14:14 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Image limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 18 16:14:14 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Flavor pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 18 16:14:14 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Image pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 18 16:14:14 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 18 16:14:14 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum 
VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 18 16:14:14 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 18 16:14:14 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Got 1 possible topologies {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 18 16:14:14 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 18 16:14:14 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 18 16:14:14 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:14:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1478486128',display_name='tempest-ServerRescueNegativeTestJSON-server-1478486128',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-1478486128',id=2,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='261e8ba82d9e4203917afb0241a3b4fc',ramdisk_id='',reservation_id='r-vu07y5ik',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-1586888284',owner_user_name='tempest-ServerRescueNegativeTestJSON-1586888284-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:14:09Z,user_data=None,user_id='a8a3f45f9c6c431781fb582b8da22b0b',uuid=da82d905-1ca1-403d-9598-7561e69b9704,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "894e80db-f051-4b32-adc8-e3afa321eb34", "address": "fa:16:3e:ad:ba:71", "network": {"id": "1bff5cf0-81ae-4bf0-88ab-fd844ee78e3f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1814061150-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "261e8ba82d9e4203917afb0241a3b4fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap894e80db-f0", "ovs_interfaceid": "894e80db-f051-4b32-adc8-e3afa321eb34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70975) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 18 16:14:14 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Converting VIF {"id": "894e80db-f051-4b32-adc8-e3afa321eb34", "address": 
"fa:16:3e:ad:ba:71", "network": {"id": "1bff5cf0-81ae-4bf0-88ab-fd844ee78e3f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1814061150-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "261e8ba82d9e4203917afb0241a3b4fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap894e80db-f0", "ovs_interfaceid": "894e80db-f051-4b32-adc8-e3afa321eb34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:14:14 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ad:ba:71,bridge_name='br-int',has_traffic_filtering=True,id=894e80db-f051-4b32-adc8-e3afa321eb34,network=Network(1bff5cf0-81ae-4bf0-88ab-fd844ee78e3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap894e80db-f0') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:14:14 user nova-compute[70975]: DEBUG nova.objects.instance [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lazy-loading 'pci_devices' on Instance uuid da82d905-1ca1-403d-9598-7561e69b9704 {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:14:14 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: da82d905-1ca1-403d-9598-7561e69b9704] End _get_guest_xml xml= Apr 18 16:14:14 user nova-compute[70975]: da82d905-1ca1-403d-9598-7561e69b9704 Apr 18 16:14:14 user nova-compute[70975]: instance-00000002 Apr 18 16:14:14 user nova-compute[70975]: 131072 Apr 18 16:14:14 user nova-compute[70975]: 1 Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: tempest-ServerRescueNegativeTestJSON-server-1478486128 Apr 18 16:14:14 user nova-compute[70975]: 2023-04-18 16:14:14 Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: 128 Apr 18 16:14:14 user nova-compute[70975]: 1 Apr 18 16:14:14 user nova-compute[70975]: 0 Apr 18 16:14:14 user nova-compute[70975]: 0 Apr 18 16:14:14 user nova-compute[70975]: 1 Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: tempest-ServerRescueNegativeTestJSON-1586888284-project-member Apr 18 16:14:14 user nova-compute[70975]: tempest-ServerRescueNegativeTestJSON-1586888284 Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 
user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: OpenStack Foundation Apr 18 16:14:14 user nova-compute[70975]: OpenStack Nova Apr 18 16:14:14 user nova-compute[70975]: 0.0.0 Apr 18 16:14:14 user nova-compute[70975]: da82d905-1ca1-403d-9598-7561e69b9704 Apr 18 16:14:14 user nova-compute[70975]: da82d905-1ca1-403d-9598-7561e69b9704 Apr 18 16:14:14 user nova-compute[70975]: Virtual Machine Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: hvm Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Nehalem Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: /dev/urandom Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: Apr 18 16:14:14 user nova-compute[70975]: {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 18 16:14:14 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:14:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1478486128',display_name='tempest-ServerRescueNegativeTestJSON-server-1478486128',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-1478486128',id=2,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='261e8ba82d9e4203917afb0241a3b4fc',ramdisk_id='',reservation_id='r-vu07y5ik',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-1586888284',owner_user_name='tempest-ServerRescueNegativeTestJSON-1586888284-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:14:09Z,user_data=None,user_id='a8a3f45f9c6c431781fb582b8da22b0b',uuid=da82d905-1ca1-403d-9598-7561e69b9704,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "894e80db-f051-4b32-adc8-e3afa321eb34", "address": "fa:16:3e:ad:ba:71", "network": {"id": "1bff5cf0-81ae-4bf0-88ab-fd844ee78e3f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1814061150-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "261e8ba82d9e4203917afb0241a3b4fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap894e80db-f0", "ovs_interfaceid": "894e80db-f051-4b32-adc8-e3afa321eb34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 18 16:14:14 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Converting VIF {"id": "894e80db-f051-4b32-adc8-e3afa321eb34", "address": 
"fa:16:3e:ad:ba:71", "network": {"id": "1bff5cf0-81ae-4bf0-88ab-fd844ee78e3f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1814061150-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "261e8ba82d9e4203917afb0241a3b4fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap894e80db-f0", "ovs_interfaceid": "894e80db-f051-4b32-adc8-e3afa321eb34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:14:14 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ad:ba:71,bridge_name='br-int',has_traffic_filtering=True,id=894e80db-f051-4b32-adc8-e3afa321eb34,network=Network(1bff5cf0-81ae-4bf0-88ab-fd844ee78e3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap894e80db-f0') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:14:14 user nova-compute[70975]: DEBUG os_vif [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ad:ba:71,bridge_name='br-int',has_traffic_filtering=True,id=894e80db-f051-4b32-adc8-e3afa321eb34,network=Network(1bff5cf0-81ae-4bf0-88ab-fd844ee78e3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap894e80db-f0') {{(pid=70975) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 18 16:14:14 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:14 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:14:14 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 18 16:14:14 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:14 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap894e80db-f0, may_exist=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:14:14 user nova-compute[70975]: DEBUG 
ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap894e80db-f0, col_values=(('external_ids', {'iface-id': '894e80db-f051-4b32-adc8-e3afa321eb34', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ad:ba:71', 'vm-uuid': 'da82d905-1ca1-403d-9598-7561e69b9704'}),)) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:14:14 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:14 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:14:14 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:14 user nova-compute[70975]: INFO os_vif [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ad:ba:71,bridge_name='br-int',has_traffic_filtering=True,id=894e80db-f051-4b32-adc8-e3afa321eb34,network=Network(1bff5cf0-81ae-4bf0-88ab-fd844ee78e3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap894e80db-f0') Apr 18 16:14:14 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=70975) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 18 16:14:14 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] No VIF found with MAC fa:16:3e:ad:ba:71, not building metadata {{(pid=70975) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 18 16:14:15 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:15 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:15 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:15 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:15 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:16 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:16 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:16 user nova-compute[70975]: DEBUG nova.network.neutron [req-b77b59d6-75ba-416a-90be-3973050f1e02 req-fae0d70d-e8d7-4f03-b5d3-75a755235cf4 service nova] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Updated VIF entry in instance network info cache for port 894e80db-f051-4b32-adc8-e3afa321eb34. 
{{(pid=70975) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 18 16:14:16 user nova-compute[70975]: DEBUG nova.network.neutron [req-b77b59d6-75ba-416a-90be-3973050f1e02 req-fae0d70d-e8d7-4f03-b5d3-75a755235cf4 service nova] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Updating instance_info_cache with network_info: [{"id": "894e80db-f051-4b32-adc8-e3afa321eb34", "address": "fa:16:3e:ad:ba:71", "network": {"id": "1bff5cf0-81ae-4bf0-88ab-fd844ee78e3f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1814061150-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "261e8ba82d9e4203917afb0241a3b4fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap894e80db-f0", "ovs_interfaceid": "894e80db-f051-4b32-adc8-e3afa321eb34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:14:16 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-b77b59d6-75ba-416a-90be-3973050f1e02 req-fae0d70d-e8d7-4f03-b5d3-75a755235cf4 service nova] Releasing lock "refresh_cache-da82d905-1ca1-403d-9598-7561e69b9704" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:14:16 user nova-compute[70975]: DEBUG nova.network.neutron [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] [instance: 1b530349-680e-4def-86ef-29c340efa175] Successfully updated port: 64d26c20-add4-4a63-bace-6a3678032692 {{(pid=70975) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 18 16:14:16 user nova-compute[70975]: DEBUG nova.network.neutron [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Successfully updated port: e5d69d5c-1a5c-4300-ab15-e73f78388f0e {{(pid=70975) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 18 16:14:16 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Acquiring lock "refresh_cache-1b530349-680e-4def-86ef-29c340efa175" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:14:16 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Acquired lock "refresh_cache-1b530349-680e-4def-86ef-29c340efa175" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:14:16 user nova-compute[70975]: DEBUG nova.network.neutron [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 
tempest-ServerActionsTestJSON-1239704997-project-member] [instance: 1b530349-680e-4def-86ef-29c340efa175] Building network info cache for instance {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 18 16:14:16 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Acquiring lock "refresh_cache-d7a293bf-a9bd-424e-ba11-bbed7dfea41c" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:14:16 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Acquired lock "refresh_cache-d7a293bf-a9bd-424e-ba11-bbed7dfea41c" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:14:16 user nova-compute[70975]: DEBUG nova.network.neutron [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Building network info cache for instance {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 18 16:14:16 user nova-compute[70975]: DEBUG nova.compute.manager [req-a5efd8f1-9042-4415-afa3-7880d2cabba5 req-00f5ca6f-de27-46ec-b2e1-0d68863f90f0 service nova] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Received event network-changed-e5d69d5c-1a5c-4300-ab15-e73f78388f0e {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:14:16 user nova-compute[70975]: DEBUG nova.compute.manager [req-a5efd8f1-9042-4415-afa3-7880d2cabba5 req-00f5ca6f-de27-46ec-b2e1-0d68863f90f0 service nova] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Refreshing instance network info cache due to event network-changed-e5d69d5c-1a5c-4300-ab15-e73f78388f0e. {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 18 16:14:16 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-a5efd8f1-9042-4415-afa3-7880d2cabba5 req-00f5ca6f-de27-46ec-b2e1-0d68863f90f0 service nova] Acquiring lock "refresh_cache-d7a293bf-a9bd-424e-ba11-bbed7dfea41c" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:14:16 user nova-compute[70975]: DEBUG nova.network.neutron [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] [instance: 1b530349-680e-4def-86ef-29c340efa175] Instance cache missing network info. 
{{(pid=70975) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 18 16:14:16 user nova-compute[70975]: DEBUG nova.compute.manager [req-1de86017-8058-4f6e-ada5-00188ee43278 req-59f05442-88ba-43dd-8884-5b43c42d4961 service nova] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Received event network-vif-plugged-894e80db-f051-4b32-adc8-e3afa321eb34 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:14:16 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-1de86017-8058-4f6e-ada5-00188ee43278 req-59f05442-88ba-43dd-8884-5b43c42d4961 service nova] Acquiring lock "da82d905-1ca1-403d-9598-7561e69b9704-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:16 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-1de86017-8058-4f6e-ada5-00188ee43278 req-59f05442-88ba-43dd-8884-5b43c42d4961 service nova] Lock "da82d905-1ca1-403d-9598-7561e69b9704-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:16 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-1de86017-8058-4f6e-ada5-00188ee43278 req-59f05442-88ba-43dd-8884-5b43c42d4961 service nova] Lock "da82d905-1ca1-403d-9598-7561e69b9704-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:16 user nova-compute[70975]: DEBUG nova.compute.manager [req-1de86017-8058-4f6e-ada5-00188ee43278 req-59f05442-88ba-43dd-8884-5b43c42d4961 service nova] [instance: da82d905-1ca1-403d-9598-7561e69b9704] No waiting events found dispatching network-vif-plugged-894e80db-f051-4b32-adc8-e3afa321eb34 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:14:16 user nova-compute[70975]: WARNING nova.compute.manager [req-1de86017-8058-4f6e-ada5-00188ee43278 req-59f05442-88ba-43dd-8884-5b43c42d4961 service nova] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Received unexpected event network-vif-plugged-894e80db-f051-4b32-adc8-e3afa321eb34 for instance with vm_state building and task_state spawning. Apr 18 16:14:16 user nova-compute[70975]: DEBUG nova.network.neutron [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Instance cache missing network info.
{{(pid=70975) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.network.neutron [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] [instance: 1b530349-680e-4def-86ef-29c340efa175] Updating instance_info_cache with network_info: [{"id": "64d26c20-add4-4a63-bace-6a3678032692", "address": "fa:16:3e:33:ec:46", "network": {"id": "f5beaf4a-eeaf-454b-bde5-dd5e1f15e9dd", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-215585786-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "caa61b19cc4e4cd4bb7d41291c40ef1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap64d26c20-ad", "ovs_interfaceid": "64d26c20-add4-4a63-bace-6a3678032692", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Releasing lock "refresh_cache-1b530349-680e-4def-86ef-29c340efa175" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.compute.manager [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] [instance: 1b530349-680e-4def-86ef-29c340efa175] Instance network_info: |[{"id": "64d26c20-add4-4a63-bace-6a3678032692", "address": "fa:16:3e:33:ec:46", "network": {"id": "f5beaf4a-eeaf-454b-bde5-dd5e1f15e9dd", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-215585786-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "caa61b19cc4e4cd4bb7d41291c40ef1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap64d26c20-ad", "ovs_interfaceid": "64d26c20-add4-4a63-bace-6a3678032692", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] [instance: 1b530349-680e-4def-86ef-29c340efa175] Start _get_guest_xml 
network_info=[{"id": "64d26c20-add4-4a63-bace-6a3678032692", "address": "fa:16:3e:33:ec:46", "network": {"id": "f5beaf4a-eeaf-454b-bde5-dd5e1f15e9dd", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-215585786-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "caa61b19cc4e4cd4bb7d41291c40ef1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap64d26c20-ad", "ovs_interfaceid": "64d26c20-add4-4a63-bace-6a3678032692", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encrypted': False, 'device_type': 'disk', 'encryption_secret_uuid': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'b11a20de-f82a-4158-b53e-0a0c7a1552cb'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 18 16:14:17 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:14:17 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70975) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-18T16:11:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=), allow threads: True {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Flavor limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Image limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Flavor pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Image pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 
18 16:14:17 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Got 1 possible topologies {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:14:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1074846308',display_name='tempest-ServerActionsTestJSON-server-1074846308',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serveractionstestjson-server-1074846308',id=3,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAYfshyMt8uY2q2eUCQtkPkI3nGNlhhmmc9vp/6UdeXopca0J7dByvvp0JsnsKIVnxALXrFdF6MbHDsrQpV6fGcr4UECEAJuS6I1V5v6lY3+aDsuDcDzQvqgi06XGLFiPA==',key_name='tempest-keypair-228479226',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='caa61b19cc4e4cd4bb7d41291c40ef1f',ramdisk_id='',reservation_id='r-1w99jwsl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1239704997',owner_user_name='tempest-ServerActionsTestJSON-1239704997-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:14:09Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='07b7b9d8fdcf42f29e83e755f4f27380',uuid=1b530349-680e-4def-86ef-29c340efa175,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "64d26c20-add4-4a63-bace-6a3678032692", "address": "fa:16:3e:33:ec:46", "network": {"id": "f5beaf4a-eeaf-454b-bde5-dd5e1f15e9dd", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-215585786-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "caa61b19cc4e4cd4bb7d41291c40ef1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap64d26c20-ad", "ovs_interfaceid": "64d26c20-add4-4a63-bace-6a3678032692", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70975) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Converting VIF {"id": "64d26c20-add4-4a63-bace-6a3678032692", "address": "fa:16:3e:33:ec:46", "network": {"id": "f5beaf4a-eeaf-454b-bde5-dd5e1f15e9dd", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-215585786-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, 
"tenant_id": "caa61b19cc4e4cd4bb7d41291c40ef1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap64d26c20-ad", "ovs_interfaceid": "64d26c20-add4-4a63-bace-6a3678032692", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:33:ec:46,bridge_name='br-int',has_traffic_filtering=True,id=64d26c20-add4-4a63-bace-6a3678032692,network=Network(f5beaf4a-eeaf-454b-bde5-dd5e1f15e9dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap64d26c20-ad') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.objects.instance [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Lazy-loading 'pci_devices' on Instance uuid 1b530349-680e-4def-86ef-29c340efa175 {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] [instance: 1b530349-680e-4def-86ef-29c340efa175] End _get_guest_xml xml= Apr 18 16:14:17 user nova-compute[70975]: 1b530349-680e-4def-86ef-29c340efa175 Apr 18 16:14:17 user nova-compute[70975]: instance-00000003 Apr 18 16:14:17 user nova-compute[70975]: 131072 Apr 18 16:14:17 user nova-compute[70975]: 1 Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: tempest-ServerActionsTestJSON-server-1074846308 Apr 18 16:14:17 user nova-compute[70975]: 2023-04-18 16:14:17 Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: 128 Apr 18 16:14:17 user nova-compute[70975]: 1 Apr 18 16:14:17 user nova-compute[70975]: 0 Apr 18 16:14:17 user nova-compute[70975]: 0 Apr 18 16:14:17 user nova-compute[70975]: 1 Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: tempest-ServerActionsTestJSON-1239704997-project-member Apr 18 16:14:17 user nova-compute[70975]: tempest-ServerActionsTestJSON-1239704997 Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: 
OpenStack Foundation Apr 18 16:14:17 user nova-compute[70975]: OpenStack Nova Apr 18 16:14:17 user nova-compute[70975]: 0.0.0 Apr 18 16:14:17 user nova-compute[70975]: 1b530349-680e-4def-86ef-29c340efa175 Apr 18 16:14:17 user nova-compute[70975]: 1b530349-680e-4def-86ef-29c340efa175 Apr 18 16:14:17 user nova-compute[70975]: Virtual Machine Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: hvm Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Nehalem Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: /dev/urandom Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:14:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1074846308',display_name='tempest-ServerActionsTestJSON-server-1074846308',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serveractionstestjson-server-1074846308',id=3,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAYfshyMt8uY2q2eUCQtkPkI3nGNlhhmmc9vp/6UdeXopca0J7dByvvp0JsnsKIVnxALXrFdF6MbHDsrQpV6fGcr4UECEAJuS6I1V5v6lY3+aDsuDcDzQvqgi06XGLFiPA==',key_name='tempest-keypair-228479226',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='caa61b19cc4e4cd4bb7d41291c40ef1f',ramdisk_id='',reservation_id='r-1w99jwsl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1239704997',owner_user_name='tempest-ServerActionsTestJSON-1239704997-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:14:09Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='07b7b9d8fdcf42f29e83e755f4f27380',uuid=1b530349-680e-4def-86ef-29c340efa175,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "64d26c20-add4-4a63-bace-6a3678032692", "address": "fa:16:3e:33:ec:46", "network": {"id": "f5beaf4a-eeaf-454b-bde5-dd5e1f15e9dd", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-215585786-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "caa61b19cc4e4cd4bb7d41291c40ef1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap64d26c20-ad", "ovs_interfaceid": "64d26c20-add4-4a63-bace-6a3678032692", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Converting VIF {"id": "64d26c20-add4-4a63-bace-6a3678032692", "address": "fa:16:3e:33:ec:46", "network": {"id": "f5beaf4a-eeaf-454b-bde5-dd5e1f15e9dd", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-215585786-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, 
"tenant_id": "caa61b19cc4e4cd4bb7d41291c40ef1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap64d26c20-ad", "ovs_interfaceid": "64d26c20-add4-4a63-bace-6a3678032692", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:33:ec:46,bridge_name='br-int',has_traffic_filtering=True,id=64d26c20-add4-4a63-bace-6a3678032692,network=Network(f5beaf4a-eeaf-454b-bde5-dd5e1f15e9dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap64d26c20-ad') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG os_vif [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:33:ec:46,bridge_name='br-int',has_traffic_filtering=True,id=64d26c20-add4-4a63-bace-6a3678032692,network=Network(f5beaf4a-eeaf-454b-bde5-dd5e1f15e9dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap64d26c20-ad') {{(pid=70975) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap64d26c20-ad, may_exist=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap64d26c20-ad, col_values=(('external_ids', {'iface-id': '64d26c20-add4-4a63-bace-6a3678032692', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:33:ec:46', 'vm-uuid': '1b530349-680e-4def-86ef-29c340efa175'}),)) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog 
[-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:17 user nova-compute[70975]: INFO os_vif [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:33:ec:46,bridge_name='br-int',has_traffic_filtering=True,id=64d26c20-add4-4a63-bace-6a3678032692,network=Network(f5beaf4a-eeaf-454b-bde5-dd5e1f15e9dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap64d26c20-ad') Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] No BDM found with device name vda, not building metadata. {{(pid=70975) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] No VIF found with MAC fa:16:3e:33:ec:46, not building metadata {{(pid=70975) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.network.neutron [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Updating instance_info_cache with network_info: [{"id": "e5d69d5c-1a5c-4300-ab15-e73f78388f0e", "address": "fa:16:3e:92:2d:7f", "network": {"id": "1bff5cf0-81ae-4bf0-88ab-fd844ee78e3f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1814061150-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "261e8ba82d9e4203917afb0241a3b4fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": 
{"0": "ovn"}}, "devname": "tape5d69d5c-1a", "ovs_interfaceid": "e5d69d5c-1a5c-4300-ab15-e73f78388f0e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Releasing lock "refresh_cache-d7a293bf-a9bd-424e-ba11-bbed7dfea41c" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.compute.manager [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Instance network_info: |[{"id": "e5d69d5c-1a5c-4300-ab15-e73f78388f0e", "address": "fa:16:3e:92:2d:7f", "network": {"id": "1bff5cf0-81ae-4bf0-88ab-fd844ee78e3f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1814061150-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "261e8ba82d9e4203917afb0241a3b4fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape5d69d5c-1a", "ovs_interfaceid": "e5d69d5c-1a5c-4300-ab15-e73f78388f0e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-a5efd8f1-9042-4415-afa3-7880d2cabba5 req-00f5ca6f-de27-46ec-b2e1-0d68863f90f0 service nova] Acquired lock "refresh_cache-d7a293bf-a9bd-424e-ba11-bbed7dfea41c" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.network.neutron [req-a5efd8f1-9042-4415-afa3-7880d2cabba5 req-00f5ca6f-de27-46ec-b2e1-0d68863f90f0 service nova] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Refreshing network info cache for port e5d69d5c-1a5c-4300-ab15-e73f78388f0e {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Start _get_guest_xml network_info=[{"id": "e5d69d5c-1a5c-4300-ab15-e73f78388f0e", "address": "fa:16:3e:92:2d:7f", "network": {"id": "1bff5cf0-81ae-4bf0-88ab-fd844ee78e3f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1814061150-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "261e8ba82d9e4203917afb0241a3b4fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape5d69d5c-1a", "ovs_interfaceid": "e5d69d5c-1a5c-4300-ab15-e73f78388f0e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encrypted': False, 'device_type': 'disk', 'encryption_secret_uuid': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'b11a20de-f82a-4158-b53e-0a0c7a1552cb'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 18 16:14:17 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:14:17 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70975) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-18T16:11:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=), allow threads: True {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Flavor limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Image limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Flavor pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Image pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum 
VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Got 1 possible topologies {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:14:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1351031695',display_name='tempest-ServerRescueNegativeTestJSON-server-1351031695',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-1351031695',id=4,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='261e8ba82d9e4203917afb0241a3b4fc',ramdisk_id='',reservation_id='r-aw8jyd7h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-1586888284',owner_user_name='tempest-ServerRescueNegativeTestJSON-1586888284-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:14:11Z,user_data=None,user_id='a8a3f45f9c6c431781fb582b8da22b0b',uuid=d7a293bf-a9bd-424e-ba11-bbed7dfea41c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e5d69d5c-1a5c-4300-ab15-e73f78388f0e", "address": "fa:16:3e:92:2d:7f", "network": {"id": "1bff5cf0-81ae-4bf0-88ab-fd844ee78e3f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1814061150-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "261e8ba82d9e4203917afb0241a3b4fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape5d69d5c-1a", "ovs_interfaceid": "e5d69d5c-1a5c-4300-ab15-e73f78388f0e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70975) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Converting VIF {"id": "e5d69d5c-1a5c-4300-ab15-e73f78388f0e", "address": 
"fa:16:3e:92:2d:7f", "network": {"id": "1bff5cf0-81ae-4bf0-88ab-fd844ee78e3f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1814061150-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "261e8ba82d9e4203917afb0241a3b4fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape5d69d5c-1a", "ovs_interfaceid": "e5d69d5c-1a5c-4300-ab15-e73f78388f0e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:92:2d:7f,bridge_name='br-int',has_traffic_filtering=True,id=e5d69d5c-1a5c-4300-ab15-e73f78388f0e,network=Network(1bff5cf0-81ae-4bf0-88ab-fd844ee78e3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5d69d5c-1a') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.objects.instance [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lazy-loading 'pci_devices' on Instance uuid d7a293bf-a9bd-424e-ba11-bbed7dfea41c {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] End _get_guest_xml xml= Apr 18 16:14:17 user nova-compute[70975]: d7a293bf-a9bd-424e-ba11-bbed7dfea41c Apr 18 16:14:17 user nova-compute[70975]: instance-00000004 Apr 18 16:14:17 user nova-compute[70975]: 131072 Apr 18 16:14:17 user nova-compute[70975]: 1 Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: tempest-ServerRescueNegativeTestJSON-server-1351031695 Apr 18 16:14:17 user nova-compute[70975]: 2023-04-18 16:14:17 Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: 128 Apr 18 16:14:17 user nova-compute[70975]: 1 Apr 18 16:14:17 user nova-compute[70975]: 0 Apr 18 16:14:17 user nova-compute[70975]: 0 Apr 18 16:14:17 user nova-compute[70975]: 1 Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: tempest-ServerRescueNegativeTestJSON-1586888284-project-member Apr 18 16:14:17 user nova-compute[70975]: tempest-ServerRescueNegativeTestJSON-1586888284 Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 
user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: OpenStack Foundation Apr 18 16:14:17 user nova-compute[70975]: OpenStack Nova Apr 18 16:14:17 user nova-compute[70975]: 0.0.0 Apr 18 16:14:17 user nova-compute[70975]: d7a293bf-a9bd-424e-ba11-bbed7dfea41c Apr 18 16:14:17 user nova-compute[70975]: d7a293bf-a9bd-424e-ba11-bbed7dfea41c Apr 18 16:14:17 user nova-compute[70975]: Virtual Machine Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: hvm Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Nehalem Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: /dev/urandom Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: Apr 18 16:14:17 user nova-compute[70975]: {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:14:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1351031695',display_name='tempest-ServerRescueNegativeTestJSON-server-1351031695',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-1351031695',id=4,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='261e8ba82d9e4203917afb0241a3b4fc',ramdisk_id='',reservation_id='r-aw8jyd7h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-1586888284',owner_user_name='tempest-ServerRescueNegativeTestJSON-1586888284-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:14:11Z,user_data=None,user_id='a8a3f45f9c6c431781fb582b8da22b0b',uuid=d7a293bf-a9bd-424e-ba11-bbed7dfea41c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e5d69d5c-1a5c-4300-ab15-e73f78388f0e", "address": "fa:16:3e:92:2d:7f", "network": {"id": "1bff5cf0-81ae-4bf0-88ab-fd844ee78e3f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1814061150-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "261e8ba82d9e4203917afb0241a3b4fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape5d69d5c-1a", "ovs_interfaceid": "e5d69d5c-1a5c-4300-ab15-e73f78388f0e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Converting VIF {"id": "e5d69d5c-1a5c-4300-ab15-e73f78388f0e", "address": 
"fa:16:3e:92:2d:7f", "network": {"id": "1bff5cf0-81ae-4bf0-88ab-fd844ee78e3f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1814061150-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "261e8ba82d9e4203917afb0241a3b4fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape5d69d5c-1a", "ovs_interfaceid": "e5d69d5c-1a5c-4300-ab15-e73f78388f0e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:92:2d:7f,bridge_name='br-int',has_traffic_filtering=True,id=e5d69d5c-1a5c-4300-ab15-e73f78388f0e,network=Network(1bff5cf0-81ae-4bf0-88ab-fd844ee78e3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5d69d5c-1a') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG os_vif [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:92:2d:7f,bridge_name='br-int',has_traffic_filtering=True,id=e5d69d5c-1a5c-4300-ab15-e73f78388f0e,network=Network(1bff5cf0-81ae-4bf0-88ab-fd844ee78e3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5d69d5c-1a') {{(pid=70975) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape5d69d5c-1a, may_exist=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG 
ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape5d69d5c-1a, col_values=(('external_ids', {'iface-id': 'e5d69d5c-1a5c-4300-ab15-e73f78388f0e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:92:2d:7f', 'vm-uuid': 'd7a293bf-a9bd-424e-ba11-bbed7dfea41c'}),)) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:17 user nova-compute[70975]: INFO os_vif [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:92:2d:7f,bridge_name='br-int',has_traffic_filtering=True,id=e5d69d5c-1a5c-4300-ab15-e73f78388f0e,network=Network(1bff5cf0-81ae-4bf0-88ab-fd844ee78e3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5d69d5c-1a') Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] No BDM found with device name vda, not building metadata. {{(pid=70975) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 18 16:14:17 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] No VIF found with MAC fa:16:3e:92:2d:7f, not building metadata {{(pid=70975) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 18 16:14:18 user nova-compute[70975]: DEBUG nova.network.neutron [req-a5efd8f1-9042-4415-afa3-7880d2cabba5 req-00f5ca6f-de27-46ec-b2e1-0d68863f90f0 service nova] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Updated VIF entry in instance network info cache for port e5d69d5c-1a5c-4300-ab15-e73f78388f0e. 
{{(pid=70975) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 18 16:14:18 user nova-compute[70975]: DEBUG nova.network.neutron [req-a5efd8f1-9042-4415-afa3-7880d2cabba5 req-00f5ca6f-de27-46ec-b2e1-0d68863f90f0 service nova] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Updating instance_info_cache with network_info: [{"id": "e5d69d5c-1a5c-4300-ab15-e73f78388f0e", "address": "fa:16:3e:92:2d:7f", "network": {"id": "1bff5cf0-81ae-4bf0-88ab-fd844ee78e3f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1814061150-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "261e8ba82d9e4203917afb0241a3b4fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape5d69d5c-1a", "ovs_interfaceid": "e5d69d5c-1a5c-4300-ab15-e73f78388f0e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:14:18 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-a5efd8f1-9042-4415-afa3-7880d2cabba5 req-00f5ca6f-de27-46ec-b2e1-0d68863f90f0 service nova] Releasing lock "refresh_cache-d7a293bf-a9bd-424e-ba11-bbed7dfea41c" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:14:18 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Resumed> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:14:18 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: da82d905-1ca1-403d-9598-7561e69b9704] VM Resumed (Lifecycle Event) Apr 18 16:14:18 user nova-compute[70975]: DEBUG nova.compute.manager [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Instance event wait completed in 0 seconds for {{(pid=70975) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 18 16:14:18 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Guest created on hypervisor {{(pid=70975) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 18 16:14:18 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:14:18 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Instance spawned successfully. 
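For the m1.nano flavor (1 vCPU, no explicit topology), the hardware module lines above report flavor and image limits and preferences of 0:0:0, fall back to maxima of 65536 per dimension, and end up with a single candidate of 1 socket, 1 core, 1 thread. A rough, illustrative re-derivation of that enumeration — a simplification for this log, not nova's actual nova.virt.hardware code:

```python
# Illustrative only: enumerate viable (sockets, cores, threads) splits for a
# vCPU count, mirroring the "possible topologies" lines in the log above.
from collections import namedtuple

VirtCPUTopology = namedtuple('VirtCPUTopology', 'sockets cores threads')

def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                        max_threads=65536):
    topologies = []
    for sockets in range(1, min(vcpus, max_sockets) + 1):
        for cores in range(1, min(vcpus, max_cores) + 1):
            for threads in range(1, min(vcpus, max_threads) + 1):
                if sockets * cores * threads == vcpus:
                    topologies.append(VirtCPUTopology(sockets, cores, threads))
    return topologies

print(possible_topologies(1))
# [VirtCPUTopology(sockets=1, cores=1, threads=1)] -- the one candidate the log
# reports as both the "possible" and the "sorted desired" topology.
```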
Apr 18 16:14:18 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 18 16:14:18 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:14:18 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Found default for hw_cdrom_bus of ide {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:14:18 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Found default for hw_disk_bus of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:14:18 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Found default for hw_input_bus of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:14:18 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Found default for hw_pointer_model of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:14:18 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Found default for hw_video_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:14:18 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Found default for hw_vif_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:14:18 user 
nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: da82d905-1ca1-403d-9598-7561e69b9704] During sync_power_state the instance has a pending task (spawning). Skip. Apr 18 16:14:18 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Started> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:14:18 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: da82d905-1ca1-403d-9598-7561e69b9704] VM Started (Lifecycle Event) Apr 18 16:14:18 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:14:18 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:14:18 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: da82d905-1ca1-403d-9598-7561e69b9704] During sync_power_state the instance has a pending task (spawning). Skip. Apr 18 16:14:18 user nova-compute[70975]: INFO nova.compute.manager [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Took 9.79 seconds to spawn the instance on the hypervisor. Apr 18 16:14:18 user nova-compute[70975]: DEBUG nova.compute.manager [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:14:18 user nova-compute[70975]: INFO nova.compute.manager [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Took 10.49 seconds to build instance. 
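The "Synchronizing instance power state ... Skip." lines above show the lifecycle handler comparing the database power state (0, NOSTATE) against the hypervisor's (1, RUNNING) and then backing off because task_state is still 'spawning'. A condensed, illustrative version of that guard, with constants matching nova.compute.power_state (not the compute manager's actual code):

```python
# Illustrative sketch of the skip guard implied by the log lines above; the
# real check happens inside nova's compute manager lifecycle-event handling.
import logging

NOSTATE, RUNNING = 0, 1          # values as defined in nova.compute.power_state
LOG = logging.getLogger(__name__)

def maybe_sync_power_state(db_power_state, vm_power_state, task_state):
    """Return the power state to record, or None when the sync is skipped."""
    if task_state is not None:
        # An in-flight task (here: 'spawning') owns the instance; reconciling
        # now would race with it, so the handler only logs and returns.
        LOG.info('During sync_power_state the instance has a pending task '
                 '(%s). Skip.', task_state)
        return None
    if db_power_state != vm_power_state:
        return vm_power_state    # caller would persist the hypervisor's view
    return None

maybe_sync_power_state(NOSTATE, RUNNING, 'spawning')   # the logged case: skipped
```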
Apr 18 16:14:18 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-60c74ee2-175e-47f0-aaf6-39717b9dae13 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lock "da82d905-1ca1-403d-9598-7561e69b9704" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 10.642s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:18 user nova-compute[70975]: DEBUG nova.compute.manager [req-f9df61f1-0309-46cf-a640-5598e81e0ffc req-4c0c83e4-43cb-49f6-a440-d4469b16ad36 service nova] [instance: 1b530349-680e-4def-86ef-29c340efa175] Received event network-changed-64d26c20-add4-4a63-bace-6a3678032692 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:14:18 user nova-compute[70975]: DEBUG nova.compute.manager [req-f9df61f1-0309-46cf-a640-5598e81e0ffc req-4c0c83e4-43cb-49f6-a440-d4469b16ad36 service nova] [instance: 1b530349-680e-4def-86ef-29c340efa175] Refreshing instance network info cache due to event network-changed-64d26c20-add4-4a63-bace-6a3678032692. {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 18 16:14:18 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-f9df61f1-0309-46cf-a640-5598e81e0ffc req-4c0c83e4-43cb-49f6-a440-d4469b16ad36 service nova] Acquiring lock "refresh_cache-1b530349-680e-4def-86ef-29c340efa175" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:14:18 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-f9df61f1-0309-46cf-a640-5598e81e0ffc req-4c0c83e4-43cb-49f6-a440-d4469b16ad36 service nova] Acquired lock "refresh_cache-1b530349-680e-4def-86ef-29c340efa175" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:14:18 user nova-compute[70975]: DEBUG nova.network.neutron [req-f9df61f1-0309-46cf-a640-5598e81e0ffc req-4c0c83e4-43cb-49f6-a440-d4469b16ad36 service nova] [instance: 1b530349-680e-4def-86ef-29c340efa175] Refreshing network info cache for port 64d26c20-add4-4a63-bace-6a3678032692 {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 18 16:14:19 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:19 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:19 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:19 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:20 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:20 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:20 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) 
__log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:20 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:20 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:20 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:20 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:21 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:21 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:21 user nova-compute[70975]: DEBUG nova.network.neutron [req-f9df61f1-0309-46cf-a640-5598e81e0ffc req-4c0c83e4-43cb-49f6-a440-d4469b16ad36 service nova] [instance: 1b530349-680e-4def-86ef-29c340efa175] Updated VIF entry in instance network info cache for port 64d26c20-add4-4a63-bace-6a3678032692. {{(pid=70975) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 18 16:14:21 user nova-compute[70975]: DEBUG nova.network.neutron [req-f9df61f1-0309-46cf-a640-5598e81e0ffc req-4c0c83e4-43cb-49f6-a440-d4469b16ad36 service nova] [instance: 1b530349-680e-4def-86ef-29c340efa175] Updating instance_info_cache with network_info: [{"id": "64d26c20-add4-4a63-bace-6a3678032692", "address": "fa:16:3e:33:ec:46", "network": {"id": "f5beaf4a-eeaf-454b-bde5-dd5e1f15e9dd", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-215585786-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "caa61b19cc4e4cd4bb7d41291c40ef1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap64d26c20-ad", "ovs_interfaceid": "64d26c20-add4-4a63-bace-6a3678032692", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:14:21 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-f9df61f1-0309-46cf-a640-5598e81e0ffc req-4c0c83e4-43cb-49f6-a440-d4469b16ad36 service nova] Releasing lock "refresh_cache-1b530349-680e-4def-86ef-29c340efa175" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:14:21 user nova-compute[70975]: DEBUG nova.compute.manager [req-f9df61f1-0309-46cf-a640-5598e81e0ffc req-4c0c83e4-43cb-49f6-a440-d4469b16ad36 service nova] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Received event 
network-vif-plugged-894e80db-f051-4b32-adc8-e3afa321eb34 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:14:21 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-f9df61f1-0309-46cf-a640-5598e81e0ffc req-4c0c83e4-43cb-49f6-a440-d4469b16ad36 service nova] Acquiring lock "da82d905-1ca1-403d-9598-7561e69b9704-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:21 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-f9df61f1-0309-46cf-a640-5598e81e0ffc req-4c0c83e4-43cb-49f6-a440-d4469b16ad36 service nova] Lock "da82d905-1ca1-403d-9598-7561e69b9704-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:21 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-f9df61f1-0309-46cf-a640-5598e81e0ffc req-4c0c83e4-43cb-49f6-a440-d4469b16ad36 service nova] Lock "da82d905-1ca1-403d-9598-7561e69b9704-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:21 user nova-compute[70975]: DEBUG nova.compute.manager [req-f9df61f1-0309-46cf-a640-5598e81e0ffc req-4c0c83e4-43cb-49f6-a440-d4469b16ad36 service nova] [instance: da82d905-1ca1-403d-9598-7561e69b9704] No waiting events found dispatching network-vif-plugged-894e80db-f051-4b32-adc8-e3afa321eb34 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:14:21 user nova-compute[70975]: WARNING nova.compute.manager [req-f9df61f1-0309-46cf-a640-5598e81e0ffc req-4c0c83e4-43cb-49f6-a440-d4469b16ad36 service nova] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Received unexpected event network-vif-plugged-894e80db-f051-4b32-adc8-e3afa321eb34 for instance with vm_state active and task_state None. 
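The block above is the external-event path: Neutron posts network-vif-plugged, the manager serializes on the per-instance "<uuid>-events" lock, tries to pop a registered waiter for that event, and, finding none, emits the "Received unexpected event" warning. A stripped-down illustration of that registry pattern, using hypothetical class and method names and standard threading primitives in place of the oslo locks seen in the log:

```python
# Illustrative sketch of the per-instance event registry visible in the
# lock / pop_instance_event / warning sequence above; names are hypothetical.
import logging
import threading
from collections import defaultdict

LOG = logging.getLogger(__name__)

class InstanceEvents:
    def __init__(self):
        self._lock = threading.Lock()          # stands in for the oslo lock
        self._waiters = defaultdict(dict)      # instance uuid -> {event: waiter}

    def prepare(self, instance_uuid, event_key):
        """Called by a task (e.g. spawn) that intends to wait for the event."""
        waiter = threading.Event()
        with self._lock:
            self._waiters[instance_uuid][event_key] = waiter
        return waiter

    def pop(self, instance_uuid, event_key):
        with self._lock:                       # "Acquiring lock ...-events"
            return self._waiters[instance_uuid].pop(event_key, None)

def dispatch_external_event(events, instance_uuid, event_key,
                            vm_state, task_state):
    waiter = events.pop(instance_uuid, event_key)
    if waiter is None:
        LOG.warning('Received unexpected event %s for instance with vm_state '
                    '%s and task_state %s.', event_key, vm_state, task_state)
        return
    waiter.set()                               # wakes wait_for_instance_event
```

In this trace the warnings appear benign: each event arrived when no code path was blocked waiting on it (for da82d905 the instance is already active), so there was simply nothing left to wake.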
Apr 18 16:14:21 user nova-compute[70975]: DEBUG nova.compute.manager [req-17d8bb86-55e0-4757-a02c-696c71ac3c3f req-8db9bfc9-ada0-4e3b-b122-8f7ea577f27d service nova] [instance: 1b530349-680e-4def-86ef-29c340efa175] Received event network-vif-plugged-64d26c20-add4-4a63-bace-6a3678032692 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:14:21 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-17d8bb86-55e0-4757-a02c-696c71ac3c3f req-8db9bfc9-ada0-4e3b-b122-8f7ea577f27d service nova] Acquiring lock "1b530349-680e-4def-86ef-29c340efa175-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:21 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-17d8bb86-55e0-4757-a02c-696c71ac3c3f req-8db9bfc9-ada0-4e3b-b122-8f7ea577f27d service nova] Lock "1b530349-680e-4def-86ef-29c340efa175-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:21 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-17d8bb86-55e0-4757-a02c-696c71ac3c3f req-8db9bfc9-ada0-4e3b-b122-8f7ea577f27d service nova] Lock "1b530349-680e-4def-86ef-29c340efa175-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.007s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:21 user nova-compute[70975]: DEBUG nova.compute.manager [req-17d8bb86-55e0-4757-a02c-696c71ac3c3f req-8db9bfc9-ada0-4e3b-b122-8f7ea577f27d service nova] [instance: 1b530349-680e-4def-86ef-29c340efa175] No waiting events found dispatching network-vif-plugged-64d26c20-add4-4a63-bace-6a3678032692 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:14:21 user nova-compute[70975]: WARNING nova.compute.manager [req-17d8bb86-55e0-4757-a02c-696c71ac3c3f req-8db9bfc9-ada0-4e3b-b122-8f7ea577f27d service nova] [instance: 1b530349-680e-4def-86ef-29c340efa175] Received unexpected event network-vif-plugged-64d26c20-add4-4a63-bace-6a3678032692 for instance with vm_state building and task_state spawning. 
Apr 18 16:14:21 user nova-compute[70975]: DEBUG nova.compute.manager [req-bb3d7774-b21f-4f6f-82c1-92bcc05a3505 req-cdcf5883-4058-4a9e-bdf4-9313ae926d9c service nova] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Received event network-vif-plugged-e5d69d5c-1a5c-4300-ab15-e73f78388f0e {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:14:21 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-bb3d7774-b21f-4f6f-82c1-92bcc05a3505 req-cdcf5883-4058-4a9e-bdf4-9313ae926d9c service nova] Acquiring lock "d7a293bf-a9bd-424e-ba11-bbed7dfea41c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:21 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-bb3d7774-b21f-4f6f-82c1-92bcc05a3505 req-cdcf5883-4058-4a9e-bdf4-9313ae926d9c service nova] Lock "d7a293bf-a9bd-424e-ba11-bbed7dfea41c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:21 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-bb3d7774-b21f-4f6f-82c1-92bcc05a3505 req-cdcf5883-4058-4a9e-bdf4-9313ae926d9c service nova] Lock "d7a293bf-a9bd-424e-ba11-bbed7dfea41c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:21 user nova-compute[70975]: DEBUG nova.compute.manager [req-bb3d7774-b21f-4f6f-82c1-92bcc05a3505 req-cdcf5883-4058-4a9e-bdf4-9313ae926d9c service nova] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] No waiting events found dispatching network-vif-plugged-e5d69d5c-1a5c-4300-ab15-e73f78388f0e {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:14:21 user nova-compute[70975]: WARNING nova.compute.manager [req-bb3d7774-b21f-4f6f-82c1-92bcc05a3505 req-cdcf5883-4058-4a9e-bdf4-9313ae926d9c service nova] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Received unexpected event network-vif-plugged-e5d69d5c-1a5c-4300-ab15-e73f78388f0e for instance with vm_state building and task_state spawning. 
Apr 18 16:14:22 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:22 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:22 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Acquiring lock "993d062c-8462-4534-bcde-9249779d4e90" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "993d062c-8462-4534-bcde-9249779d4e90" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.compute.manager [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Starting instance... {{(pid=70975) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.003s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=70975) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 18 16:14:23 user nova-compute[70975]: INFO nova.compute.claims [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Claim successful on node user Apr 18 16:14:23 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.compute.manager [req-2630565b-281c-4e9a-8408-2e43131fe47b req-660b440b-c064-4a0c-a29f-935b63a1ed79 service nova] [instance: 1b530349-680e-4def-86ef-29c340efa175] Received event network-vif-plugged-64d26c20-add4-4a63-bace-6a3678032692 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-2630565b-281c-4e9a-8408-2e43131fe47b req-660b440b-c064-4a0c-a29f-935b63a1ed79 service nova] Acquiring lock "1b530349-680e-4def-86ef-29c340efa175-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-2630565b-281c-4e9a-8408-2e43131fe47b req-660b440b-c064-4a0c-a29f-935b63a1ed79 service nova] Lock "1b530349-680e-4def-86ef-29c340efa175-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-2630565b-281c-4e9a-8408-2e43131fe47b req-660b440b-c064-4a0c-a29f-935b63a1ed79 service nova] Lock "1b530349-680e-4def-86ef-29c340efa175-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.compute.manager [req-2630565b-281c-4e9a-8408-2e43131fe47b req-660b440b-c064-4a0c-a29f-935b63a1ed79 service nova] [instance: 1b530349-680e-4def-86ef-29c340efa175] No waiting events found dispatching network-vif-plugged-64d26c20-add4-4a63-bace-6a3678032692 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:14:23 user nova-compute[70975]: WARNING nova.compute.manager [req-2630565b-281c-4e9a-8408-2e43131fe47b req-660b440b-c064-4a0c-a29f-935b63a1ed79 service nova] [instance: 1b530349-680e-4def-86ef-29c340efa175] Received unexpected event network-vif-plugged-64d26c20-add4-4a63-bace-6a3678032692 for instance with vm_state building and task_state spawning. 
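The hardware.py note above ("Require both a host and instance NUMA topology to fit instance on host") is the guard that lets the claim succeed with no NUMA fitting for these flavors. Roughly, under assumed names (a sketch, not the real function body):

    def numa_fit_instance_to_host(host_topology, instance_topology):
        # Without both topologies there is nothing to fit, so the instance
        # claim above proceeds with no NUMA pinning at all.
        if not (host_topology and instance_topology):
            return None
        raise NotImplementedError('cell-by-cell fitting elided in this sketch')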
Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.compute.manager [req-ccae80fc-9645-46e6-92a7-107c59605b78 req-209fd942-01f7-4690-8eb8-9f77203c3320 service nova] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Received event network-vif-plugged-e5d69d5c-1a5c-4300-ab15-e73f78388f0e {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-ccae80fc-9645-46e6-92a7-107c59605b78 req-209fd942-01f7-4690-8eb8-9f77203c3320 service nova] Acquiring lock "d7a293bf-a9bd-424e-ba11-bbed7dfea41c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-ccae80fc-9645-46e6-92a7-107c59605b78 req-209fd942-01f7-4690-8eb8-9f77203c3320 service nova] Lock "d7a293bf-a9bd-424e-ba11-bbed7dfea41c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-ccae80fc-9645-46e6-92a7-107c59605b78 req-209fd942-01f7-4690-8eb8-9f77203c3320 service nova] Lock "d7a293bf-a9bd-424e-ba11-bbed7dfea41c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.compute.manager [req-ccae80fc-9645-46e6-92a7-107c59605b78 req-209fd942-01f7-4690-8eb8-9f77203c3320 service nova] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] No waiting events found dispatching network-vif-plugged-e5d69d5c-1a5c-4300-ab15-e73f78388f0e {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:14:23 user nova-compute[70975]: WARNING nova.compute.manager [req-ccae80fc-9645-46e6-92a7-107c59605b78 req-209fd942-01f7-4690-8eb8-9f77203c3320 service nova] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Received unexpected event network-vif-plugged-e5d69d5c-1a5c-4300-ab15-e73f78388f0e for instance with vm_state building and task_state spawning. 
Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Resumed> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:14:23 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 1b530349-680e-4def-86ef-29c340efa175] VM Resumed (Lifecycle Event) Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.compute.manager [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] [instance: 1b530349-680e-4def-86ef-29c340efa175] Instance event wait completed in 0 seconds for {{(pid=70975) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] [instance: 1b530349-680e-4def-86ef-29c340efa175] Guest created on hypervisor {{(pid=70975) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.compute.manager [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Instance event wait completed in 0 seconds for {{(pid=70975) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Guest created on hypervisor {{(pid=70975) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 18 16:14:23 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: 1b530349-680e-4def-86ef-29c340efa175] Instance spawned successfully. Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] [instance: 1b530349-680e-4def-86ef-29c340efa175] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 18 16:14:23 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Instance spawned successfully. 
Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 1b530349-680e-4def-86ef-29c340efa175] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 1b530349-680e-4def-86ef-29c340efa175] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Found default for hw_cdrom_bus of ide {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Found default for hw_disk_bus of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Found default for hw_input_bus of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Found default for hw_pointer_model of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Found default for hw_video_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 
tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Found default for hw_vif_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] [instance: 1b530349-680e-4def-86ef-29c340efa175] Found default for hw_cdrom_bus of ide {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] [instance: 1b530349-680e-4def-86ef-29c340efa175] Found default for hw_disk_bus of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] [instance: 1b530349-680e-4def-86ef-29c340efa175] Found default for hw_input_bus of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] [instance: 1b530349-680e-4def-86ef-29c340efa175] Found default for hw_pointer_model of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] [instance: 1b530349-680e-4def-86ef-29c340efa175] Found default for hw_video_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] [instance: 1b530349-680e-4def-86ef-29c340efa175] Found default for hw_vif_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:14:23 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 1b530349-680e-4def-86ef-29c340efa175] During sync_power_state the instance has a pending task (spawning). Skip. 
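The driver.py:918 entries above register identical defaults for both instances; collected into one mapping, with the values copied straight from the log:

    registered_defaults = {
        'hw_cdrom_bus': 'ide',
        'hw_disk_bus': 'virtio',
        'hw_input_bus': None,
        'hw_pointer_model': None,
        'hw_video_model': 'virtio',
        'hw_vif_model': 'virtio',
    }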
Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Started> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:14:23 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 1b530349-680e-4def-86ef-29c340efa175] VM Started (Lifecycle Event) Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 1b530349-680e-4def-86ef-29c340efa175] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 1b530349-680e-4def-86ef-29c340efa175] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:14:23 user nova-compute[70975]: INFO nova.compute.manager [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] [instance: 1b530349-680e-4def-86ef-29c340efa175] Took 14.39 seconds to spawn the instance on the hypervisor. Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.compute.manager [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] [instance: 1b530349-680e-4def-86ef-29c340efa175] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:14:23 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 1b530349-680e-4def-86ef-29c340efa175] During sync_power_state the instance has a pending task (spawning). Skip. 
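The inventory payload reported for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 above yields the effective capacities Placement schedules against, via capacity = (total - reserved) * allocation_ratio; worked out from the logged values:

    inventory = {
        'MEMORY_MB': {'total': 16023, 'reserved': 512, 'allocation_ratio': 1.0},
        'VCPU':      {'total': 12,    'reserved': 0,   'allocation_ratio': 4.0},
        'DISK_GB':   {'total': 40,    'reserved': 0,   'allocation_ratio': 1.0},
    }
    for rc, inv in inventory.items():
        print(rc, (inv['total'] - inv['reserved']) * inv['allocation_ratio'])
    # MEMORY_MB 15511.0, VCPU 48.0, DISK_GB 40.0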
Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Resumed> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:14:23 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] VM Resumed (Lifecycle Event) Apr 18 16:14:23 user nova-compute[70975]: INFO nova.compute.manager [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Took 12.35 seconds to spawn the instance on the hypervisor. Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.compute.manager [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.586s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.compute.manager [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Start building networks asynchronously for instance. {{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:14:23 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] During sync_power_state the instance has a pending task (spawning). Skip. 
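Every "Synchronizing instance power state ..." / "During sync_power_state the instance has a pending task (spawning). Skip." pair above applies the same rule: a libvirt lifecycle event does not override the database power state while a task is in flight. Compressed into a sketch (illustrative, not the compute manager's actual code):

    NOSTATE, RUNNING = 0, 1   # DB power_state 0 vs. VM power_state 1, as logged

    def sync_power_state(db_power_state, vm_power_state, task_state):
        if task_state is not None:
            # vm_state 'building' with task_state 'spawning' above -> Skip.
            return db_power_state
        return vm_power_state

    assert sync_power_state(NOSTATE, RUNNING, 'spawning') == NOSTATE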
Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Started> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:14:23 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] VM Started (Lifecycle Event) Apr 18 16:14:23 user nova-compute[70975]: INFO nova.compute.manager [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] [instance: 1b530349-680e-4def-86ef-29c340efa175] Took 15.63 seconds to build instance. Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.compute.manager [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Allocating IP information in the background. {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.network.neutron [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: 993d062c-8462-4534-bcde-9249779d4e90] allocate_for_instance() {{(pid=70975) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-dde7154f-a2cf-49d1-9157-7ced6e66e06f tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Lock "1b530349-680e-4def-86ef-29c340efa175" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 15.758s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:14:23 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 18 16:14:23 user nova-compute[70975]: DEBUG nova.compute.manager [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Start building block device mappings for instance. 
{{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 18 16:14:23 user nova-compute[70975]: INFO nova.compute.manager [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Took 13.26 seconds to build instance. Apr 18 16:14:23 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-11ef488d-46c3-48cb-b234-0aa757927a21 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lock "d7a293bf-a9bd-424e-ba11-bbed7dfea41c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 13.377s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:24 user nova-compute[70975]: DEBUG nova.compute.manager [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Start spawning the instance on the hypervisor. {{(pid=70975) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 18 16:14:24 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Creating instance directory {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 18 16:14:24 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Creating image(s) Apr 18 16:14:24 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Acquiring lock "/opt/stack/data/nova/instances/993d062c-8462-4534-bcde-9249779d4e90/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:24 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "/opt/stack/data/nova/instances/993d062c-8462-4534-bcde-9249779d4e90/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:24 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "/opt/stack/data/nova/instances/993d062c-8462-4534-bcde-9249779d4e90/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.004s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:24 user nova-compute[70975]: DEBUG 
oslo_concurrency.processutils [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:24 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.155s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:24 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Acquiring lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:24 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.004s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:24 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:24 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.153s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:24 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o 
backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/993d062c-8462-4534-bcde-9249779d4e90/disk 1073741824 {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:24 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/993d062c-8462-4534-bcde-9249779d4e90/disk 1073741824" returned: 0 in 0.072s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:24 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.234s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:24 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:24 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:24 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.151s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:24 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Checking if we can resize image /opt/stack/data/nova/instances/993d062c-8462-4534-bcde-9249779d4e90/disk. 
size=1073741824 {{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 18 16:14:24 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/993d062c-8462-4534-bcde-9249779d4e90/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:24 user nova-compute[70975]: DEBUG nova.policy [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'af90e17ec027463fa8793e8539c39e13', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6b4e8d8797be4c0e91b1401538f2eba8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70975) authorize /opt/stack/nova/nova/policy.py:203}} Apr 18 16:14:24 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/993d062c-8462-4534-bcde-9249779d4e90/disk --force-share --output=json" returned: 0 in 0.146s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:24 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Cannot resize image /opt/stack/data/nova/instances/993d062c-8462-4534-bcde-9249779d4e90/disk to a smaller size. 
{{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 18 16:14:24 user nova-compute[70975]: DEBUG nova.objects.instance [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lazy-loading 'migration_context' on Instance uuid 993d062c-8462-4534-bcde-9249779d4e90 {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:14:24 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Created local disks {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 18 16:14:24 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Ensure instance console log exists: /opt/stack/data/nova/instances/993d062c-8462-4534-bcde-9249779d4e90/console.log {{(pid=70975) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 18 16:14:24 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:24 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:24 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:25 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:26 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:27 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:27 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:29 user nova-compute[70975]: DEBUG nova.network.neutron [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] 
[instance: 993d062c-8462-4534-bcde-9249779d4e90] Successfully created port: fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3 {{(pid=70975) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 18 16:14:32 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:32 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:32 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:34 user nova-compute[70975]: DEBUG nova.network.neutron [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Successfully updated port: fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3 {{(pid=70975) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 18 16:14:34 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Acquiring lock "refresh_cache-993d062c-8462-4534-bcde-9249779d4e90" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:14:34 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Acquired lock "refresh_cache-993d062c-8462-4534-bcde-9249779d4e90" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:14:34 user nova-compute[70975]: DEBUG nova.network.neutron [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Building network info cache for instance {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 18 16:14:34 user nova-compute[70975]: DEBUG nova.compute.manager [req-41f89b3c-a178-48a1-846d-dde105f193b3 req-20b65c34-9d73-41fe-a9cc-ae69903da943 service nova] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Received event network-changed-fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:14:34 user nova-compute[70975]: DEBUG nova.compute.manager [req-41f89b3c-a178-48a1-846d-dde105f193b3 req-20b65c34-9d73-41fe-a9cc-ae69903da943 service nova] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Refreshing instance network info cache due to event network-changed-fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3. 
{{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 18 16:14:34 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-41f89b3c-a178-48a1-846d-dde105f193b3 req-20b65c34-9d73-41fe-a9cc-ae69903da943 service nova] Acquiring lock "refresh_cache-993d062c-8462-4534-bcde-9249779d4e90" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:14:34 user nova-compute[70975]: DEBUG nova.network.neutron [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Instance cache missing network info. {{(pid=70975) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 18 16:14:35 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG nova.network.neutron [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Updating instance_info_cache with network_info: [{"id": "fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3", "address": "fa:16:3e:d2:bc:43", "network": {"id": "02aca424-2923-404b-9c66-76bec89f82b7", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1255258227-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6b4e8d8797be4c0e91b1401538f2eba8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe3d4b7c-e1", "ovs_interfaceid": "fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Acquiring lock "aaac3797-349f-4695-bea2-8b0c022a66e0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Lock "aaac3797-349f-4695-bea2-8b0c022a66e0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70975) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Releasing lock "refresh_cache-993d062c-8462-4534-bcde-9249779d4e90" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG nova.compute.manager [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Instance network_info: |[{"id": "fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3", "address": "fa:16:3e:d2:bc:43", "network": {"id": "02aca424-2923-404b-9c66-76bec89f82b7", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1255258227-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6b4e8d8797be4c0e91b1401538f2eba8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe3d4b7c-e1", "ovs_interfaceid": "fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-41f89b3c-a178-48a1-846d-dde105f193b3 req-20b65c34-9d73-41fe-a9cc-ae69903da943 service nova] Acquired lock "refresh_cache-993d062c-8462-4534-bcde-9249779d4e90" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG nova.network.neutron [req-41f89b3c-a178-48a1-846d-dde105f193b3 req-20b65c34-9d73-41fe-a9cc-ae69903da943 service nova] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Refreshing network info cache for port fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3 {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Start _get_guest_xml network_info=[{"id": "fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3", "address": "fa:16:3e:d2:bc:43", "network": {"id": "02aca424-2923-404b-9c66-76bec89f82b7", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1255258227-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6b4e8d8797be4c0e91b1401538f2eba8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, 
"connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe3d4b7c-e1", "ovs_interfaceid": "fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encrypted': False, 'device_type': 'disk', 'encryption_secret_uuid': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'b11a20de-f82a-4158-b53e-0a0c7a1552cb'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG nova.compute.manager [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Starting instance... {{(pid=70975) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 18 16:14:36 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:14:36 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 18 16:14:36 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70975) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-18T16:11:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=), allow threads: True {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Flavor limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Image limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Flavor pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Image pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70975) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:564}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Got 1 possible topologies {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:14:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1278260269',display_name='tempest-AttachVolumeNegativeTest-server-1278260269',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-1278260269',id=5,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDO/CLWqaabu1PPSB6IO5u9ZPRsbyk1aJTiCtDZZM4ehxz6NX8dqpiUe00Z9Nr+BHXqhNNOtIquxOnLmyxJxZVKgMQccZdSmpkhgpRi7hndOMSE64mNrbe1QQ/t5OUkS2w==',key_name='tempest-keypair-87186417',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6b4e8d8797be4c0e91b1401538f2eba8',ramdisk_id='',reservation_id='r-5k1ey47l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-216357456',owner_user_name='tempest-AttachVolumeNegativeTest-216357456-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:14:24Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='af90e17ec027463fa8793e8539c39e13',uuid=993d062c-8462-4534-bcde-9249779d4e90,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3", "address": "fa:16:3e:d2:bc:43", "network": {"id": "02aca424-2923-404b-9c66-76bec89f82b7", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1255258227-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6b4e8d8797be4c0e91b1401538f2eba8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe3d4b7c-e1", "ovs_interfaceid": "fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70975) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Converting VIF {"id": "fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3", "address": "fa:16:3e:d2:bc:43", "network": {"id": "02aca424-2923-404b-9c66-76bec89f82b7", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1255258227-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "6b4e8d8797be4c0e91b1401538f2eba8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe3d4b7c-e1", "ovs_interfaceid": "fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d2:bc:43,bridge_name='br-int',has_traffic_filtering=True,id=fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3,network=Network(02aca424-2923-404b-9c66-76bec89f82b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe3d4b7c-e1') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG nova.objects.instance [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lazy-loading 'pci_devices' on Instance uuid 993d062c-8462-4534-bcde-9249779d4e90 {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: 993d062c-8462-4534-bcde-9249779d4e90] End _get_guest_xml xml= Apr 18 16:14:36 user nova-compute[70975]: 993d062c-8462-4534-bcde-9249779d4e90 Apr 18 16:14:36 user nova-compute[70975]: instance-00000005 Apr 18 16:14:36 user nova-compute[70975]: 131072 Apr 18 16:14:36 user nova-compute[70975]: 1 Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: tempest-AttachVolumeNegativeTest-server-1278260269 Apr 18 16:14:36 user nova-compute[70975]: 2023-04-18 16:14:36 Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: 128 Apr 18 16:14:36 user nova-compute[70975]: 1 Apr 18 16:14:36 user nova-compute[70975]: 0 Apr 18 16:14:36 user nova-compute[70975]: 0 Apr 18 16:14:36 user nova-compute[70975]: 1 Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: tempest-AttachVolumeNegativeTest-216357456-project-member Apr 18 16:14:36 user nova-compute[70975]: tempest-AttachVolumeNegativeTest-216357456 Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: OpenStack Foundation Apr 18 16:14:36 user nova-compute[70975]: OpenStack Nova Apr 18 16:14:36 user nova-compute[70975]: 0.0.0 Apr 18 16:14:36 user 
nova-compute[70975]: 993d062c-8462-4534-bcde-9249779d4e90 Apr 18 16:14:36 user nova-compute[70975]: 993d062c-8462-4534-bcde-9249779d4e90 Apr 18 16:14:36 user nova-compute[70975]: Virtual Machine Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: hvm Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Nehalem Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: /dev/urandom Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: Apr 18 16:14:36 user nova-compute[70975]: {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:14:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1278260269',display_name='tempest-AttachVolumeNegativeTest-server-1278260269',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-1278260269',id=5,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDO/CLWqaabu1PPSB6IO5u9ZPRsbyk1aJTiCtDZZM4ehxz6NX8dqpiUe00Z9Nr+BHXqhNNOtIquxOnLmyxJxZVKgMQccZdSmpkhgpRi7hndOMSE64mNrbe1QQ/t5OUkS2w==',key_name='tempest-keypair-87186417',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6b4e8d8797be4c0e91b1401538f2eba8',ramdisk_id='',reservation_id='r-5k1ey47l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-216357456',owner_user_name='tempest-AttachVolumeNegativeTest-216357456-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:14:24Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='af90e17ec027463fa8793e8539c39e13',uuid=993d062c-8462-4534-bcde-9249779d4e90,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3", "address": "fa:16:3e:d2:bc:43", "network": {"id": "02aca424-2923-404b-9c66-76bec89f82b7", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1255258227-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6b4e8d8797be4c0e91b1401538f2eba8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe3d4b7c-e1", "ovs_interfaceid": "fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Converting VIF {"id": "fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3", "address": "fa:16:3e:d2:bc:43", "network": {"id": "02aca424-2923-404b-9c66-76bec89f82b7", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1255258227-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": 
false, "tenant_id": "6b4e8d8797be4c0e91b1401538f2eba8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe3d4b7c-e1", "ovs_interfaceid": "fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d2:bc:43,bridge_name='br-int',has_traffic_filtering=True,id=fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3,network=Network(02aca424-2923-404b-9c66-76bec89f82b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe3d4b7c-e1') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG os_vif [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d2:bc:43,bridge_name='br-int',has_traffic_filtering=True,id=fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3,network=Network(02aca424-2923-404b-9c66-76bec89f82b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe3d4b7c-e1') {{(pid=70975) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 
16:14:36 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfe3d4b7c-e1, may_exist=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfe3d4b7c-e1, col_values=(('external_ids', {'iface-id': 'fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d2:bc:43', 'vm-uuid': '993d062c-8462-4534-bcde-9249779d4e90'}),)) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=70975) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 18 16:14:36 user nova-compute[70975]: INFO nova.compute.claims [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Claim successful on node user Apr 18 16:14:36 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:36 user nova-compute[70975]: INFO os_vif [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d2:bc:43,bridge_name='br-int',has_traffic_filtering=True,id=fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3,network=Network(02aca424-2923-404b-9c66-76bec89f82b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe3d4b7c-e1') Apr 18 16:14:36 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=70975) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] No VIF found with MAC fa:16:3e:d2:bc:43, not building metadata {{(pid=70975) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 18 16:14:36 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:37 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:37 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:14:37 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:14:37 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.654s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:37 user nova-compute[70975]: DEBUG nova.compute.manager [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Start building networks asynchronously for instance. {{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 18 16:14:37 user nova-compute[70975]: DEBUG nova.compute.manager [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Allocating IP information in the background. 
{{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 18 16:14:37 user nova-compute[70975]: DEBUG nova.network.neutron [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] allocate_for_instance() {{(pid=70975) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 18 16:14:37 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Ignoring supplied device name: /dev/sda. Libvirt can't honour user-supplied dev names Apr 18 16:14:37 user nova-compute[70975]: DEBUG nova.compute.manager [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Start building block device mappings for instance. {{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 18 16:14:37 user nova-compute[70975]: DEBUG nova.compute.manager [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Start spawning the instance on the hypervisor. {{(pid=70975) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 18 16:14:37 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Creating instance directory {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 18 16:14:37 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Creating image(s) Apr 18 16:14:37 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Acquiring lock "/opt/stack/data/nova/instances/aaac3797-349f-4695-bea2-8b0c022a66e0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:37 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Lock "/opt/stack/data/nova/instances/aaac3797-349f-4695-bea2-8b0c022a66e0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:37 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Lock 
"/opt/stack/data/nova/instances/aaac3797-349f-4695-bea2-8b0c022a66e0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:37 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Acquiring lock "b80c482fce25803c74c8bd962c6d7181a6b503de" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:37 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Lock "b80c482fce25803c74c8bd962c6d7181a6b503de" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:37 user nova-compute[70975]: DEBUG nova.policy [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ffaa9df682cb40739d1d754000e04743', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8b357cc820a04f3486f98d8e38c1a3d6', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70975) authorize /opt/stack/nova/nova/policy.py:203}} Apr 18 16:14:37 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/b80c482fce25803c74c8bd962c6d7181a6b503de.part --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:37 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:38 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/b80c482fce25803c74c8bd962c6d7181a6b503de.part --force-share --output=json" returned: 0 in 0.162s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:38 user nova-compute[70975]: DEBUG nova.virt.images [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] 2824808e-fd92-429e-ad00-18522a9ee7be was qcow2, 
converting to raw {{(pid=70975) fetch_to_raw /opt/stack/nova/nova/virt/images.py:165}} Apr 18 16:14:38 user nova-compute[70975]: DEBUG nova.privsep.utils [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=70975) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 18 16:14:38 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/b80c482fce25803c74c8bd962c6d7181a6b503de.part /opt/stack/data/nova/instances/_base/b80c482fce25803c74c8bd962c6d7181a6b503de.converted {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:38 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:38 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:38 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:38 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/b80c482fce25803c74c8bd962c6d7181a6b503de.part /opt/stack/data/nova/instances/_base/b80c482fce25803c74c8bd962c6d7181a6b503de.converted" returned: 0 in 0.163s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:38 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/b80c482fce25803c74c8bd962c6d7181a6b503de.converted --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:38 user nova-compute[70975]: DEBUG nova.compute.manager [req-766b38dd-a7da-4794-953b-25d65ff73b3f req-1359f129-af7c-4af6-a38d-b0ed197131b7 service nova] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Received event network-vif-plugged-fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:14:38 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-766b38dd-a7da-4794-953b-25d65ff73b3f req-1359f129-af7c-4af6-a38d-b0ed197131b7 service nova] Acquiring lock "993d062c-8462-4534-bcde-9249779d4e90-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:38 user nova-compute[70975]: DEBUG 
oslo_concurrency.lockutils [req-766b38dd-a7da-4794-953b-25d65ff73b3f req-1359f129-af7c-4af6-a38d-b0ed197131b7 service nova] Lock "993d062c-8462-4534-bcde-9249779d4e90-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:38 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-766b38dd-a7da-4794-953b-25d65ff73b3f req-1359f129-af7c-4af6-a38d-b0ed197131b7 service nova] Lock "993d062c-8462-4534-bcde-9249779d4e90-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:38 user nova-compute[70975]: DEBUG nova.compute.manager [req-766b38dd-a7da-4794-953b-25d65ff73b3f req-1359f129-af7c-4af6-a38d-b0ed197131b7 service nova] [instance: 993d062c-8462-4534-bcde-9249779d4e90] No waiting events found dispatching network-vif-plugged-fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:14:38 user nova-compute[70975]: WARNING nova.compute.manager [req-766b38dd-a7da-4794-953b-25d65ff73b3f req-1359f129-af7c-4af6-a38d-b0ed197131b7 service nova] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Received unexpected event network-vif-plugged-fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3 for instance with vm_state building and task_state spawning. Apr 18 16:14:38 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/b80c482fce25803c74c8bd962c6d7181a6b503de.converted --force-share --output=json" returned: 0 in 0.162s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:38 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Lock "b80c482fce25803c74c8bd962c6d7181a6b503de" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.938s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:38 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/b80c482fce25803c74c8bd962c6d7181a6b503de --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:38 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/_base/b80c482fce25803c74c8bd962c6d7181a6b503de --force-share --output=json" returned: 0 in 0.169s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:38 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Acquiring lock "b80c482fce25803c74c8bd962c6d7181a6b503de" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:38 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Lock "b80c482fce25803c74c8bd962c6d7181a6b503de" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:38 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/b80c482fce25803c74c8bd962c6d7181a6b503de --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:38 user nova-compute[70975]: DEBUG nova.network.neutron [req-41f89b3c-a178-48a1-846d-dde105f193b3 req-20b65c34-9d73-41fe-a9cc-ae69903da943 service nova] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Updated VIF entry in instance network info cache for port fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3. 
{{(pid=70975) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 18 16:14:38 user nova-compute[70975]: DEBUG nova.network.neutron [req-41f89b3c-a178-48a1-846d-dde105f193b3 req-20b65c34-9d73-41fe-a9cc-ae69903da943 service nova] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Updating instance_info_cache with network_info: [{"id": "fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3", "address": "fa:16:3e:d2:bc:43", "network": {"id": "02aca424-2923-404b-9c66-76bec89f82b7", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1255258227-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6b4e8d8797be4c0e91b1401538f2eba8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe3d4b7c-e1", "ovs_interfaceid": "fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:14:38 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-41f89b3c-a178-48a1-846d-dde105f193b3 req-20b65c34-9d73-41fe-a9cc-ae69903da943 service nova] Releasing lock "refresh_cache-993d062c-8462-4534-bcde-9249779d4e90" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:14:38 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/b80c482fce25803c74c8bd962c6d7181a6b503de --force-share --output=json" returned: 0 in 0.152s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:38 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/b80c482fce25803c74c8bd962c6d7181a6b503de,backing_fmt=raw /opt/stack/data/nova/instances/aaac3797-349f-4695-bea2-8b0c022a66e0/disk 1073741824 {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:38 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:38 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:38 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] CMD 
"env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/b80c482fce25803c74c8bd962c6d7181a6b503de,backing_fmt=raw /opt/stack/data/nova/instances/aaac3797-349f-4695-bea2-8b0c022a66e0/disk 1073741824" returned: 0 in 0.060s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:38 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Lock "b80c482fce25803c74c8bd962c6d7181a6b503de" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.217s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:38 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/b80c482fce25803c74c8bd962c6d7181a6b503de --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:39 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/b80c482fce25803c74c8bd962c6d7181a6b503de --force-share --output=json" returned: 0 in 0.136s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:39 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Checking if we can resize image /opt/stack/data/nova/instances/aaac3797-349f-4695-bea2-8b0c022a66e0/disk. 
size=1073741824 {{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 18 16:14:39 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/aaac3797-349f-4695-bea2-8b0c022a66e0/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:39 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/aaac3797-349f-4695-bea2-8b0c022a66e0/disk --force-share --output=json" returned: 0 in 0.166s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:39 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Cannot resize image /opt/stack/data/nova/instances/aaac3797-349f-4695-bea2-8b0c022a66e0/disk to a smaller size. {{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 18 16:14:39 user nova-compute[70975]: DEBUG nova.objects.instance [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Lazy-loading 'migration_context' on Instance uuid aaac3797-349f-4695-bea2-8b0c022a66e0 {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:14:39 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Created local disks {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 18 16:14:39 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Ensure instance console log exists: /opt/stack/data/nova/instances/aaac3797-349f-4695-bea2-8b0c022a66e0/console.log {{(pid=70975) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 18 16:14:39 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:39 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Lock "vgpu_resources" acquired by 
"nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:39 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:39 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:39 user nova-compute[70975]: DEBUG nova.network.neutron [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Successfully created port: f2d5008c-284e-45a5-b349-4fe0723e138e {{(pid=70975) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 18 16:14:40 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Resumed> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:14:40 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 993d062c-8462-4534-bcde-9249779d4e90] VM Resumed (Lifecycle Event) Apr 18 16:14:40 user nova-compute[70975]: DEBUG nova.compute.manager [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Instance event wait completed in 0 seconds for {{(pid=70975) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 18 16:14:40 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Guest created on hypervisor {{(pid=70975) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 18 16:14:40 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Instance spawned successfully. 
Apr 18 16:14:40 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 18 16:14:40 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:14:40 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:14:40 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Found default for hw_cdrom_bus of ide {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:14:40 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Found default for hw_disk_bus of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:14:40 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Found default for hw_input_bus of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:14:40 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Found default for hw_pointer_model of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:14:40 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Found default for hw_video_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:14:40 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: 
993d062c-8462-4534-bcde-9249779d4e90] Found default for hw_vif_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:14:40 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 993d062c-8462-4534-bcde-9249779d4e90] During sync_power_state the instance has a pending task (spawning). Skip. Apr 18 16:14:40 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Started> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:14:40 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 993d062c-8462-4534-bcde-9249779d4e90] VM Started (Lifecycle Event) Apr 18 16:14:40 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:14:40 user nova-compute[70975]: DEBUG nova.compute.manager [req-30bf49db-a2a2-4864-ba64-3c5988686fbe req-d05317c7-9d94-4d7f-8215-45765fcb89ce service nova] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Received event network-vif-plugged-fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:14:40 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-30bf49db-a2a2-4864-ba64-3c5988686fbe req-d05317c7-9d94-4d7f-8215-45765fcb89ce service nova] Acquiring lock "993d062c-8462-4534-bcde-9249779d4e90-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:40 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-30bf49db-a2a2-4864-ba64-3c5988686fbe req-d05317c7-9d94-4d7f-8215-45765fcb89ce service nova] Lock "993d062c-8462-4534-bcde-9249779d4e90-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.002s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:40 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-30bf49db-a2a2-4864-ba64-3c5988686fbe req-d05317c7-9d94-4d7f-8215-45765fcb89ce service nova] Lock "993d062c-8462-4534-bcde-9249779d4e90-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:40 user nova-compute[70975]: DEBUG nova.compute.manager [req-30bf49db-a2a2-4864-ba64-3c5988686fbe req-d05317c7-9d94-4d7f-8215-45765fcb89ce service nova] [instance: 993d062c-8462-4534-bcde-9249779d4e90] No waiting events found dispatching network-vif-plugged-fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:14:40 user nova-compute[70975]: WARNING nova.compute.manager [req-30bf49db-a2a2-4864-ba64-3c5988686fbe req-d05317c7-9d94-4d7f-8215-45765fcb89ce service nova] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Received unexpected event network-vif-plugged-fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3 for instance with vm_state building and task_state spawning. 
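
[Editor's note] The WARNING just above ("Received unexpected event network-vif-plugged-...") is benign here: Neutron reports port activation to Nova through the external-events API, and the compute manager only treats such an event as expected if a waiter was registered for that (instance, event) pair before the notification arrived. The following is a deliberately simplified sketch of that waiter pattern, keyed by (instance_uuid, event_name); it is not Nova's InstanceEvents implementation.

# Simplified illustration of expected/unexpected external-event handling.
import threading

_waiters: dict[tuple[str, str], threading.Event] = {}
_lock = threading.Lock()

def prepare_for_event(instance_uuid: str, event_name: str) -> threading.Event:
    """Register interest before starting the operation that triggers the event."""
    ev = threading.Event()
    with _lock:
        _waiters[(instance_uuid, event_name)] = ev
    return ev

def deliver_event(instance_uuid: str, event_name: str) -> None:
    """Called when the Neutron notification arrives via the external-events API."""
    with _lock:
        ev = _waiters.pop((instance_uuid, event_name), None)
    if ev is None:
        # No registered waiter -> corresponds to the "Received unexpected event"
        # warning in the log; the event is simply dropped.
        print(f"unexpected event {event_name} for {instance_uuid}")
    else:
        ev.set()

# Usage: register, plug the VIF / start the guest, then wait with a timeout.
waiter = prepare_for_event("993d062c-8462-4534-bcde-9249779d4e90",
                           "network-vif-plugged-fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3")
# ... plug vif, define and start the domain ...
waiter.wait(timeout=300)
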
Apr 18 16:14:40 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:14:40 user nova-compute[70975]: INFO nova.compute.manager [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Took 16.51 seconds to spawn the instance on the hypervisor. Apr 18 16:14:40 user nova-compute[70975]: DEBUG nova.compute.manager [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:14:40 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 993d062c-8462-4534-bcde-9249779d4e90] During sync_power_state the instance has a pending task (spawning). Skip. Apr 18 16:14:40 user nova-compute[70975]: INFO nova.compute.manager [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Took 17.52 seconds to build instance. Apr 18 16:14:40 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c0011d44-6dfa-4442-a2cb-51e16c960bcc tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "993d062c-8462-4534-bcde-9249779d4e90" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 17.706s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:41 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:41 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:42 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:42 user nova-compute[70975]: DEBUG nova.network.neutron [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Successfully updated port: f2d5008c-284e-45a5-b349-4fe0723e138e {{(pid=70975) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 18 16:14:42 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Acquiring lock "refresh_cache-aaac3797-349f-4695-bea2-8b0c022a66e0" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:14:42 user nova-compute[70975]: 
DEBUG oslo_concurrency.lockutils [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Acquired lock "refresh_cache-aaac3797-349f-4695-bea2-8b0c022a66e0" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:14:42 user nova-compute[70975]: DEBUG nova.network.neutron [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Building network info cache for instance {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 18 16:14:42 user nova-compute[70975]: DEBUG nova.compute.manager [req-ddd7be87-de64-434e-9ca1-c0a8b7735806 req-2627648f-205b-4f09-ab9a-f9df7c64b298 service nova] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Received event network-changed-f2d5008c-284e-45a5-b349-4fe0723e138e {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:14:42 user nova-compute[70975]: DEBUG nova.compute.manager [req-ddd7be87-de64-434e-9ca1-c0a8b7735806 req-2627648f-205b-4f09-ab9a-f9df7c64b298 service nova] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Refreshing instance network info cache due to event network-changed-f2d5008c-284e-45a5-b349-4fe0723e138e. {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 18 16:14:42 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-ddd7be87-de64-434e-9ca1-c0a8b7735806 req-2627648f-205b-4f09-ab9a-f9df7c64b298 service nova] Acquiring lock "refresh_cache-aaac3797-349f-4695-bea2-8b0c022a66e0" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:14:42 user nova-compute[70975]: DEBUG nova.network.neutron [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Instance cache missing network info. 
{{(pid=70975) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 18 16:14:42 user nova-compute[70975]: DEBUG nova.network.neutron [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Updating instance_info_cache with network_info: [{"id": "f2d5008c-284e-45a5-b349-4fe0723e138e", "address": "fa:16:3e:76:88:8e", "network": {"id": "a8157d06-a7f6-4b9c-ae66-cd40da31eb6d", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1365454803-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8b357cc820a04f3486f98d8e38c1a3d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf2d5008c-28", "ovs_interfaceid": "f2d5008c-284e-45a5-b349-4fe0723e138e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:14:42 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:43 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Releasing lock "refresh_cache-aaac3797-349f-4695-bea2-8b0c022a66e0" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:14:43 user nova-compute[70975]: DEBUG nova.compute.manager [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Instance network_info: |[{"id": "f2d5008c-284e-45a5-b349-4fe0723e138e", "address": "fa:16:3e:76:88:8e", "network": {"id": "a8157d06-a7f6-4b9c-ae66-cd40da31eb6d", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1365454803-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8b357cc820a04f3486f98d8e38c1a3d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf2d5008c-28", "ovs_interfaceid": "f2d5008c-284e-45a5-b349-4fe0723e138e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 18 16:14:43 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils 
[req-ddd7be87-de64-434e-9ca1-c0a8b7735806 req-2627648f-205b-4f09-ab9a-f9df7c64b298 service nova] Acquired lock "refresh_cache-aaac3797-349f-4695-bea2-8b0c022a66e0" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:14:43 user nova-compute[70975]: DEBUG nova.network.neutron [req-ddd7be87-de64-434e-9ca1-c0a8b7735806 req-2627648f-205b-4f09-ab9a-f9df7c64b298 service nova] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Refreshing network info cache for port f2d5008c-284e-45a5-b349-4fe0723e138e {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 18 16:14:43 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Start _get_guest_xml network_info=[{"id": "f2d5008c-284e-45a5-b349-4fe0723e138e", "address": "fa:16:3e:76:88:8e", "network": {"id": "a8157d06-a7f6-4b9c-ae66-cd40da31eb6d", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1365454803-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8b357cc820a04f3486f98d8e38c1a3d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf2d5008c-28", "ovs_interfaceid": "f2d5008c-284e-45a5-b349-4fe0723e138e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'scsi', 'cdrom_bus': 'scsi', 'mapping': {'root': {'bus': 'scsi', 'dev': 'sda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'scsi', 'dev': 'sda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'scsi', 'dev': 'sdb', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:14:29Z,direct_url=,disk_format='qcow2',id=2824808e-fd92-429e-ad00-18522a9ee7be,min_disk=0,min_ram=0,name='',owner='5cc9cf5ed3a242249c4a1adc55de66a6',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:14:31Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/sda', 'image': [{'device_name': '/dev/sda', 'size': 0, 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encrypted': False, 'device_type': 'disk', 'encryption_secret_uuid': None, 'encryption_format': None, 'disk_bus': 'scsi', 'image_id': '2824808e-fd92-429e-ad00-18522a9ee7be'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 18 16:14:43 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 18 16:14:43 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:14:43 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70975) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 18 16:14:43 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-18T16:11:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:14:29Z,direct_url=,disk_format='qcow2',id=2824808e-fd92-429e-ad00-18522a9ee7be,min_disk=0,min_ram=0,name='',owner='5cc9cf5ed3a242249c4a1adc55de66a6',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:14:31Z,virtual_size=,visibility=), allow threads: True {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 18 16:14:43 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Flavor limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 18 16:14:43 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Image limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 18 16:14:43 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Flavor pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 18 16:14:43 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Image pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 18 16:14:43 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 18 16:14:43 user nova-compute[70975]: DEBUG nova.virt.hardware [None 
req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 18 16:14:43 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 18 16:14:43 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Got 1 possible topologies {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 18 16:14:43 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 18 16:14:43 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 18 16:14:43 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2023-04-18T16:14:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-182924015',display_name='tempest-AttachSCSIVolumeTestJSON-server-182924015',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachscsivolumetestjson-server-182924015',id=6,image_ref='2824808e-fd92-429e-ad00-18522a9ee7be',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBONN6MRKRvMlP+yhMvoK61g7j3Fx5jQQ3LJcp2/nEL6Bw4QbghpzmZf4ISq5JqxxU2idrMJX8n+LTtOiGMui6w8KD50cE/dbEKkPZMqCi7adfHQOEiIsLKutmIbwwhmkFw==',key_name='tempest-keypair-566143322',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8b357cc820a04f3486f98d8e38c1a3d6',ramdisk_id='',reservation_id='r-mxkh2z6f',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2824808e-fd92-429e-ad00-18522a9ee7be',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_machine_type='pc',image_hw_scsi_model='virtio-scsi',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachSCSIVolumeTestJSON-344223138',owner_user_name='tempest-AttachSCSIVolumeTestJSON-344223138-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:14:37Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ffaa9df682cb40739d1d754000e04743',uuid=aaac3797-349f-4695-bea2-8b0c022a66e0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f2d5008c-284e-45a5-b349-4fe0723e138e", "address": "fa:16:3e:76:88:8e", "network": {"id": "a8157d06-a7f6-4b9c-ae66-cd40da31eb6d", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1365454803-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8b357cc820a04f3486f98d8e38c1a3d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf2d5008c-28", "ovs_interfaceid": "f2d5008c-284e-45a5-b349-4fe0723e138e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70975) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 18 16:14:43 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Converting VIF {"id": "f2d5008c-284e-45a5-b349-4fe0723e138e", "address": "fa:16:3e:76:88:8e", "network": {"id": "a8157d06-a7f6-4b9c-ae66-cd40da31eb6d", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1365454803-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8b357cc820a04f3486f98d8e38c1a3d6", "mtu": 1442, "physical_network": 
null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf2d5008c-28", "ovs_interfaceid": "f2d5008c-284e-45a5-b349-4fe0723e138e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:14:43 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:76:88:8e,bridge_name='br-int',has_traffic_filtering=True,id=f2d5008c-284e-45a5-b349-4fe0723e138e,network=Network(a8157d06-a7f6-4b9c-ae66-cd40da31eb6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf2d5008c-28') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:14:43 user nova-compute[70975]: DEBUG nova.objects.instance [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Lazy-loading 'pci_devices' on Instance uuid aaac3797-349f-4695-bea2-8b0c022a66e0 {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:14:43 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] End _get_guest_xml xml= Apr 18 16:14:43 user nova-compute[70975]: aaac3797-349f-4695-bea2-8b0c022a66e0 Apr 18 16:14:43 user nova-compute[70975]: instance-00000006 Apr 18 16:14:43 user nova-compute[70975]: 131072 Apr 18 16:14:43 user nova-compute[70975]: 1 Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: tempest-AttachSCSIVolumeTestJSON-server-182924015 Apr 18 16:14:43 user nova-compute[70975]: 2023-04-18 16:14:43 Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: 128 Apr 18 16:14:43 user nova-compute[70975]: 1 Apr 18 16:14:43 user nova-compute[70975]: 0 Apr 18 16:14:43 user nova-compute[70975]: 0 Apr 18 16:14:43 user nova-compute[70975]: 1 Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: tempest-AttachSCSIVolumeTestJSON-344223138-project-member Apr 18 16:14:43 user nova-compute[70975]: tempest-AttachSCSIVolumeTestJSON-344223138 Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: OpenStack Foundation Apr 18 16:14:43 user nova-compute[70975]: OpenStack Nova Apr 18 16:14:43 user nova-compute[70975]: 0.0.0 Apr 18 16:14:43 user nova-compute[70975]: aaac3797-349f-4695-bea2-8b0c022a66e0 Apr 18 16:14:43 user nova-compute[70975]: 
aaac3797-349f-4695-bea2-8b0c022a66e0 Apr 18 16:14:43 user nova-compute[70975]: Virtual Machine Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: hvm Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Nehalem Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]:
Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]:
Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: /dev/urandom Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: Apr 18 16:14:43 user nova-compute[70975]: {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 18 16:14:43 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2023-04-18T16:14:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-182924015',display_name='tempest-AttachSCSIVolumeTestJSON-server-182924015',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachscsivolumetestjson-server-182924015',id=6,image_ref='2824808e-fd92-429e-ad00-18522a9ee7be',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBONN6MRKRvMlP+yhMvoK61g7j3Fx5jQQ3LJcp2/nEL6Bw4QbghpzmZf4ISq5JqxxU2idrMJX8n+LTtOiGMui6w8KD50cE/dbEKkPZMqCi7adfHQOEiIsLKutmIbwwhmkFw==',key_name='tempest-keypair-566143322',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8b357cc820a04f3486f98d8e38c1a3d6',ramdisk_id='',reservation_id='r-mxkh2z6f',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2824808e-fd92-429e-ad00-18522a9ee7be',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_machine_type='pc',image_hw_scsi_model='virtio-scsi',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachSCSIVolumeTestJSON-344223138',owner_user_name='tempest-AttachSCSIVolumeTestJSON-344223138-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:14:37Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ffaa9df682cb40739d1d754000e04743',uuid=aaac3797-349f-4695-bea2-8b0c022a66e0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f2d5008c-284e-45a5-b349-4fe0723e138e", "address": "fa:16:3e:76:88:8e", "network": {"id": "a8157d06-a7f6-4b9c-ae66-cd40da31eb6d", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1365454803-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8b357cc820a04f3486f98d8e38c1a3d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf2d5008c-28", "ovs_interfaceid": "f2d5008c-284e-45a5-b349-4fe0723e138e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 18 16:14:43 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Converting VIF {"id": "f2d5008c-284e-45a5-b349-4fe0723e138e", "address": "fa:16:3e:76:88:8e", "network": {"id": "a8157d06-a7f6-4b9c-ae66-cd40da31eb6d", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1365454803-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8b357cc820a04f3486f98d8e38c1a3d6", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf2d5008c-28", "ovs_interfaceid": "f2d5008c-284e-45a5-b349-4fe0723e138e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:14:43 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:76:88:8e,bridge_name='br-int',has_traffic_filtering=True,id=f2d5008c-284e-45a5-b349-4fe0723e138e,network=Network(a8157d06-a7f6-4b9c-ae66-cd40da31eb6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf2d5008c-28') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:14:43 user nova-compute[70975]: DEBUG os_vif [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:76:88:8e,bridge_name='br-int',has_traffic_filtering=True,id=f2d5008c-284e-45a5-b349-4fe0723e138e,network=Network(a8157d06-a7f6-4b9c-ae66-cd40da31eb6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf2d5008c-28') {{(pid=70975) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 18 16:14:43 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:43 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:14:43 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 18 16:14:43 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:43 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf2d5008c-28, may_exist=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:14:43 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf2d5008c-28, col_values=(('external_ids', {'iface-id': 'f2d5008c-284e-45a5-b349-4fe0723e138e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:76:88:8e', 'vm-uuid': 'aaac3797-349f-4695-bea2-8b0c022a66e0'}),)) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:14:43 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:43 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:14:43 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:43 user nova-compute[70975]: INFO os_vif [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:76:88:8e,bridge_name='br-int',has_traffic_filtering=True,id=f2d5008c-284e-45a5-b349-4fe0723e138e,network=Network(a8157d06-a7f6-4b9c-ae66-cd40da31eb6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf2d5008c-28') Apr 18 16:14:43 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] No BDM found with device name sda, not building metadata. {{(pid=70975) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 18 16:14:43 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] No BDM found with device name sdb, not building metadata. {{(pid=70975) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 18 16:14:43 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] No VIF found with MAC fa:16:3e:76:88:8e, not building metadata {{(pid=70975) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 18 16:14:43 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Using config drive Apr 18 16:14:43 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:43 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Creating config drive at /opt/stack/data/nova/instances/aaac3797-349f-4695-bea2-8b0c022a66e0/disk.config Apr 18 16:14:43 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Running cmd (subprocess): genisoimage -o /opt/stack/data/nova/instances/aaac3797-349f-4695-bea2-8b0c022a66e0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 0.0.0 -quiet -J -r -V config-2 /tmp/tmpj3egxt98 {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:43 user nova-compute[70975]: DEBUG 
oslo_concurrency.processutils [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] CMD "genisoimage -o /opt/stack/data/nova/instances/aaac3797-349f-4695-bea2-8b0c022a66e0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 0.0.0 -quiet -J -r -V config-2 /tmp/tmpj3egxt98" returned: 0 in 0.063s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:43 user nova-compute[70975]: DEBUG nova.network.neutron [req-ddd7be87-de64-434e-9ca1-c0a8b7735806 req-2627648f-205b-4f09-ab9a-f9df7c64b298 service nova] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Updated VIF entry in instance network info cache for port f2d5008c-284e-45a5-b349-4fe0723e138e. {{(pid=70975) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 18 16:14:43 user nova-compute[70975]: DEBUG nova.network.neutron [req-ddd7be87-de64-434e-9ca1-c0a8b7735806 req-2627648f-205b-4f09-ab9a-f9df7c64b298 service nova] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Updating instance_info_cache with network_info: [{"id": "f2d5008c-284e-45a5-b349-4fe0723e138e", "address": "fa:16:3e:76:88:8e", "network": {"id": "a8157d06-a7f6-4b9c-ae66-cd40da31eb6d", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1365454803-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8b357cc820a04f3486f98d8e38c1a3d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf2d5008c-28", "ovs_interfaceid": "f2d5008c-284e-45a5-b349-4fe0723e138e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:14:43 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-ddd7be87-de64-434e-9ca1-c0a8b7735806 req-2627648f-205b-4f09-ab9a-f9df7c64b298 service nova] Releasing lock "refresh_cache-aaac3797-349f-4695-bea2-8b0c022a66e0" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:14:44 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Acquiring lock "6c592508-0444-4b42-a0b5-e3d8bd97f5ba" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:44 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Lock "6c592508-0444-4b42-a0b5-e3d8bd97f5ba" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70975) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:44 user nova-compute[70975]: DEBUG nova.compute.manager [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Starting instance... {{(pid=70975) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 18 16:14:44 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:44 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:44 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=70975) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 18 16:14:44 user nova-compute[70975]: INFO nova.compute.claims [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Claim successful on node user Apr 18 16:14:44 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:14:44 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:14:44 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.425s {{(pid=70975) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:44 user nova-compute[70975]: DEBUG nova.compute.manager [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Start building networks asynchronously for instance. {{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 18 16:14:44 user nova-compute[70975]: DEBUG nova.compute.manager [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Allocating IP information in the background. {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 18 16:14:44 user nova-compute[70975]: DEBUG nova.network.neutron [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] allocate_for_instance() {{(pid=70975) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 18 16:14:44 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:45 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 18 16:14:45 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:45 user nova-compute[70975]: DEBUG nova.compute.manager [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Start building block device mappings for instance. {{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 18 16:14:45 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:45 user nova-compute[70975]: DEBUG nova.compute.manager [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Start spawning the instance on the hypervisor. 
{{(pid=70975) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 18 16:14:45 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Creating instance directory {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 18 16:14:45 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Creating image(s) Apr 18 16:14:45 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Acquiring lock "/opt/stack/data/nova/instances/6c592508-0444-4b42-a0b5-e3d8bd97f5ba/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:45 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Lock "/opt/stack/data/nova/instances/6c592508-0444-4b42-a0b5-e3d8bd97f5ba/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:45 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Lock "/opt/stack/data/nova/instances/6c592508-0444-4b42-a0b5-e3d8bd97f5ba/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.004s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:45 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:45 user nova-compute[70975]: DEBUG nova.policy [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '299ba2e202244f59a09e22df9ea8efe7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8edf93a24e754e1ea58c0a7fd4f553dc', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': 
None, 'service_roles': []} {{(pid=70975) authorize /opt/stack/nova/nova/policy.py:203}} Apr 18 16:14:45 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.164s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:45 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Acquiring lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:45 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:45 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:45 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.165s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:45 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/6c592508-0444-4b42-a0b5-e3d8bd97f5ba/disk 1073741824 {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:45 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 
tempest-VolumesAdminNegativeTest-2015888259-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/6c592508-0444-4b42-a0b5-e3d8bd97f5ba/disk 1073741824" returned: 0 in 0.056s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:45 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.233s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:45 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:45 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:46 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:46 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.168s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:46 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Checking if we can resize image /opt/stack/data/nova/instances/6c592508-0444-4b42-a0b5-e3d8bd97f5ba/disk. 
size=1073741824 {{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 18 16:14:46 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6c592508-0444-4b42-a0b5-e3d8bd97f5ba/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:46 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:46 user nova-compute[70975]: DEBUG nova.compute.manager [req-344e0525-b8a0-4866-a2db-72ba08cb376a req-b8d706da-c395-4dd1-bf95-988b42720e5b service nova] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Received event network-vif-plugged-f2d5008c-284e-45a5-b349-4fe0723e138e {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:14:46 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-344e0525-b8a0-4866-a2db-72ba08cb376a req-b8d706da-c395-4dd1-bf95-988b42720e5b service nova] Acquiring lock "aaac3797-349f-4695-bea2-8b0c022a66e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:46 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-344e0525-b8a0-4866-a2db-72ba08cb376a req-b8d706da-c395-4dd1-bf95-988b42720e5b service nova] Lock "aaac3797-349f-4695-bea2-8b0c022a66e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:46 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-344e0525-b8a0-4866-a2db-72ba08cb376a req-b8d706da-c395-4dd1-bf95-988b42720e5b service nova] Lock "aaac3797-349f-4695-bea2-8b0c022a66e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:46 user nova-compute[70975]: DEBUG nova.compute.manager [req-344e0525-b8a0-4866-a2db-72ba08cb376a req-b8d706da-c395-4dd1-bf95-988b42720e5b service nova] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] No waiting events found dispatching network-vif-plugged-f2d5008c-284e-45a5-b349-4fe0723e138e {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:14:46 user nova-compute[70975]: WARNING nova.compute.manager [req-344e0525-b8a0-4866-a2db-72ba08cb376a req-b8d706da-c395-4dd1-bf95-988b42720e5b service nova] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Received unexpected event network-vif-plugged-f2d5008c-284e-45a5-b349-4fe0723e138e for instance with vm_state building and task_state spawning. 
Apr 18 16:14:46 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6c592508-0444-4b42-a0b5-e3d8bd97f5ba/disk --force-share --output=json" returned: 0 in 0.187s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:46 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Cannot resize image /opt/stack/data/nova/instances/6c592508-0444-4b42-a0b5-e3d8bd97f5ba/disk to a smaller size. {{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 18 16:14:46 user nova-compute[70975]: DEBUG nova.objects.instance [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Lazy-loading 'migration_context' on Instance uuid 6c592508-0444-4b42-a0b5-e3d8bd97f5ba {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:14:46 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Created local disks {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 18 16:14:46 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Ensure instance console log exists: /opt/stack/data/nova/instances/6c592508-0444-4b42-a0b5-e3d8bd97f5ba/console.log {{(pid=70975) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 18 16:14:46 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:46 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:46 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:47 user nova-compute[70975]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:47 user nova-compute[70975]: DEBUG nova.network.neutron [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Successfully created port: 395afd81-e898-47ee-a928-eaab584d5b4e {{(pid=70975) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 18 16:14:47 user nova-compute[70975]: DEBUG nova.compute.manager [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Instance event wait completed in 0 seconds for {{(pid=70975) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 18 16:14:47 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Guest created on hypervisor {{(pid=70975) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 18 16:14:47 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Resumed> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:14:47 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] VM Resumed (Lifecycle Event) Apr 18 16:14:47 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Instance spawned successfully. 
Apr 18 16:14:47 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Attempting to register defaults for the following image properties: ['hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 18 16:14:47 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Found default for hw_input_bus of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:14:47 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Found default for hw_pointer_model of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:14:47 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Found default for hw_video_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:14:47 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Found default for hw_vif_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:14:47 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:14:47 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:14:47 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 18 16:14:47 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Started> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:14:47 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] VM Started (Lifecycle Event) Apr 18 16:14:47 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:14:47 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:14:47 user nova-compute[70975]: INFO nova.compute.manager [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Took 10.12 seconds to spawn the instance on the hypervisor. Apr 18 16:14:47 user nova-compute[70975]: DEBUG nova.compute.manager [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:14:47 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] During sync_power_state the instance has a pending task (spawning). Skip. Apr 18 16:14:47 user nova-compute[70975]: INFO nova.compute.manager [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Took 11.16 seconds to build instance. 
Apr 18 16:14:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ab1e8281-d1dc-4121-bf74-0042d40b8e4f tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Lock "aaac3797-349f-4695-bea2-8b0c022a66e0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 11.284s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:48 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:48 user nova-compute[70975]: DEBUG nova.compute.manager [req-44fc8817-5dab-41e7-a9e5-caaac7eadf0d req-9a11c995-a4d0-4e7d-9dbd-a1dc85638fff service nova] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Received event network-vif-plugged-f2d5008c-284e-45a5-b349-4fe0723e138e {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:14:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-44fc8817-5dab-41e7-a9e5-caaac7eadf0d req-9a11c995-a4d0-4e7d-9dbd-a1dc85638fff service nova] Acquiring lock "aaac3797-349f-4695-bea2-8b0c022a66e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-44fc8817-5dab-41e7-a9e5-caaac7eadf0d req-9a11c995-a4d0-4e7d-9dbd-a1dc85638fff service nova] Lock "aaac3797-349f-4695-bea2-8b0c022a66e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-44fc8817-5dab-41e7-a9e5-caaac7eadf0d req-9a11c995-a4d0-4e7d-9dbd-a1dc85638fff service nova] Lock "aaac3797-349f-4695-bea2-8b0c022a66e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:48 user nova-compute[70975]: DEBUG nova.compute.manager [req-44fc8817-5dab-41e7-a9e5-caaac7eadf0d req-9a11c995-a4d0-4e7d-9dbd-a1dc85638fff service nova] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] No waiting events found dispatching network-vif-plugged-f2d5008c-284e-45a5-b349-4fe0723e138e {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:14:48 user nova-compute[70975]: WARNING nova.compute.manager [req-44fc8817-5dab-41e7-a9e5-caaac7eadf0d req-9a11c995-a4d0-4e7d-9dbd-a1dc85638fff service nova] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Received unexpected event network-vif-plugged-f2d5008c-284e-45a5-b349-4fe0723e138e for instance with vm_state active and task_state None. 
Apr 18 16:14:49 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:49 user nova-compute[70975]: DEBUG nova.network.neutron [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Successfully updated port: 395afd81-e898-47ee-a928-eaab584d5b4e {{(pid=70975) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 18 16:14:49 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Acquiring lock "refresh_cache-6c592508-0444-4b42-a0b5-e3d8bd97f5ba" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:14:49 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Acquired lock "refresh_cache-6c592508-0444-4b42-a0b5-e3d8bd97f5ba" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:14:49 user nova-compute[70975]: DEBUG nova.network.neutron [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Building network info cache for instance {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 18 16:14:49 user nova-compute[70975]: DEBUG nova.network.neutron [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Instance cache missing network info. {{(pid=70975) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 18 16:14:50 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:50 user nova-compute[70975]: DEBUG nova.compute.manager [req-3a6aed1e-aa97-402c-a4fd-281ac59d7b34 req-f8ce8192-a200-4464-bbf9-05da117ca2bd service nova] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Received event network-changed-395afd81-e898-47ee-a928-eaab584d5b4e {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:14:50 user nova-compute[70975]: DEBUG nova.compute.manager [req-3a6aed1e-aa97-402c-a4fd-281ac59d7b34 req-f8ce8192-a200-4464-bbf9-05da117ca2bd service nova] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Refreshing instance network info cache due to event network-changed-395afd81-e898-47ee-a928-eaab584d5b4e. 
{{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 18 16:14:50 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-3a6aed1e-aa97-402c-a4fd-281ac59d7b34 req-f8ce8192-a200-4464-bbf9-05da117ca2bd service nova] Acquiring lock "refresh_cache-6c592508-0444-4b42-a0b5-e3d8bd97f5ba" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:14:50 user nova-compute[70975]: DEBUG nova.network.neutron [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Updating instance_info_cache with network_info: [{"id": "395afd81-e898-47ee-a928-eaab584d5b4e", "address": "fa:16:3e:fa:1c:ad", "network": {"id": "0fafa4bf-5d8d-43f9-8cc3-ae6e7c58812c", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-891115046-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8edf93a24e754e1ea58c0a7fd4f553dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap395afd81-e8", "ovs_interfaceid": "395afd81-e898-47ee-a928-eaab584d5b4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:14:50 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Releasing lock "refresh_cache-6c592508-0444-4b42-a0b5-e3d8bd97f5ba" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:14:50 user nova-compute[70975]: DEBUG nova.compute.manager [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Instance network_info: |[{"id": "395afd81-e898-47ee-a928-eaab584d5b4e", "address": "fa:16:3e:fa:1c:ad", "network": {"id": "0fafa4bf-5d8d-43f9-8cc3-ae6e7c58812c", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-891115046-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8edf93a24e754e1ea58c0a7fd4f553dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap395afd81-e8", "ovs_interfaceid": "395afd81-e898-47ee-a928-eaab584d5b4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70975) 
_allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 18 16:14:50 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-3a6aed1e-aa97-402c-a4fd-281ac59d7b34 req-f8ce8192-a200-4464-bbf9-05da117ca2bd service nova] Acquired lock "refresh_cache-6c592508-0444-4b42-a0b5-e3d8bd97f5ba" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:14:50 user nova-compute[70975]: DEBUG nova.network.neutron [req-3a6aed1e-aa97-402c-a4fd-281ac59d7b34 req-f8ce8192-a200-4464-bbf9-05da117ca2bd service nova] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Refreshing network info cache for port 395afd81-e898-47ee-a928-eaab584d5b4e {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 18 16:14:50 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Start _get_guest_xml network_info=[{"id": "395afd81-e898-47ee-a928-eaab584d5b4e", "address": "fa:16:3e:fa:1c:ad", "network": {"id": "0fafa4bf-5d8d-43f9-8cc3-ae6e7c58812c", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-891115046-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8edf93a24e754e1ea58c0a7fd4f553dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap395afd81-e8", "ovs_interfaceid": "395afd81-e898-47ee-a928-eaab584d5b4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encrypted': False, 'device_type': 'disk', 'encryption_secret_uuid': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'b11a20de-f82a-4158-b53e-0a0c7a1552cb'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 18 16:14:50 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 18 16:14:50 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:14:50 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70975) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 18 16:14:50 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-18T16:11:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=), allow threads: True {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 18 16:14:50 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Flavor limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 18 16:14:50 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Image limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 18 16:14:50 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Flavor pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 18 16:14:50 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Image pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 18 16:14:50 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} 
Apr 18 16:14:50 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 18 16:14:50 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 18 16:14:50 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Got 1 possible topologies {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 18 16:14:50 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 18 16:14:50 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 18 16:14:50 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:14:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-370003702',display_name='tempest-VolumesAdminNegativeTest-server-370003702',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-370003702',id=7,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAniXSetAQE9Sn51zA8NpTX2dOiul2qACE7wlThUOvDLY/XUKayPw9h+boGYtqxwA3BNtZbXaC0adc4Uojp5kUY4JmnKz7unbT3y9taLOI+qBOXnUno++8x4d6lIizphZQ==',key_name='tempest-keypair-1220171208',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8edf93a24e754e1ea58c0a7fd4f553dc',ramdisk_id='',reservation_id='r-cxt60s0r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-2015888259',owner_user_name='tempest-VolumesAdminNegativeTest-2015888259-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:14:45Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='299ba2e202244f59a09e22df9ea8efe7',uuid=6c592508-0444-4b42-a0b5-e3d8bd97f5ba,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "395afd81-e898-47ee-a928-eaab584d5b4e", "address": "fa:16:3e:fa:1c:ad", "network": {"id": "0fafa4bf-5d8d-43f9-8cc3-ae6e7c58812c", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-891115046-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8edf93a24e754e1ea58c0a7fd4f553dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap395afd81-e8", "ovs_interfaceid": "395afd81-e898-47ee-a928-eaab584d5b4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70975) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 18 16:14:50 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Converting VIF {"id": "395afd81-e898-47ee-a928-eaab584d5b4e", "address": "fa:16:3e:fa:1c:ad", "network": {"id": "0fafa4bf-5d8d-43f9-8cc3-ae6e7c58812c", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-891115046-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "8edf93a24e754e1ea58c0a7fd4f553dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap395afd81-e8", "ovs_interfaceid": "395afd81-e898-47ee-a928-eaab584d5b4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:14:50 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fa:1c:ad,bridge_name='br-int',has_traffic_filtering=True,id=395afd81-e898-47ee-a928-eaab584d5b4e,network=Network(0fafa4bf-5d8d-43f9-8cc3-ae6e7c58812c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap395afd81-e8') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:14:50 user nova-compute[70975]: DEBUG nova.objects.instance [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Lazy-loading 'pci_devices' on Instance uuid 6c592508-0444-4b42-a0b5-e3d8bd97f5ba {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:14:50 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] End _get_guest_xml xml=
[guest domain XML omitted: the capture stripped the XML markup from this multi-line log block, leaving only bare element values (instance uuid 6c592508-0444-4b42-a0b5-e3d8bd97f5ba, name instance-00000007, 131072, 1, tempest-VolumesAdminNegativeTest-server-370003702, 2023-04-18 16:14:50, 128, the tempest project/user names, OpenStack Foundation, OpenStack Nova, 0.0.0, Virtual Machine, hvm, Nehalem, /dev/urandom)] {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 18 16:14:50 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:14:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-370003702',display_name='tempest-VolumesAdminNegativeTest-server-370003702',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-370003702',id=7,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAniXSetAQE9Sn51zA8NpTX2dOiul2qACE7wlThUOvDLY/XUKayPw9h+boGYtqxwA3BNtZbXaC0adc4Uojp5kUY4JmnKz7unbT3y9taLOI+qBOXnUno++8x4d6lIizphZQ==',key_name='tempest-keypair-1220171208',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8edf93a24e754e1ea58c0a7fd4f553dc',ramdisk_id='',reservation_id='r-cxt60s0r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-2015888259',owner_user_name='tempest-VolumesAdminNegativeTest-2015888259-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:14:45Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='299ba2e202244f59a09e22df9ea8efe7',uuid=6c592508-0444-4b42-a0b5-e3d8bd97f5ba,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "395afd81-e898-47ee-a928-eaab584d5b4e", "address": "fa:16:3e:fa:1c:ad", "network": {"id": "0fafa4bf-5d8d-43f9-8cc3-ae6e7c58812c", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-891115046-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8edf93a24e754e1ea58c0a7fd4f553dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap395afd81-e8", "ovs_interfaceid": "395afd81-e898-47ee-a928-eaab584d5b4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 18 16:14:50 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Converting VIF {"id": "395afd81-e898-47ee-a928-eaab584d5b4e", "address": "fa:16:3e:fa:1c:ad", "network": {"id": "0fafa4bf-5d8d-43f9-8cc3-ae6e7c58812c", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-891115046-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "8edf93a24e754e1ea58c0a7fd4f553dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap395afd81-e8", "ovs_interfaceid": "395afd81-e898-47ee-a928-eaab584d5b4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:14:50 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fa:1c:ad,bridge_name='br-int',has_traffic_filtering=True,id=395afd81-e898-47ee-a928-eaab584d5b4e,network=Network(0fafa4bf-5d8d-43f9-8cc3-ae6e7c58812c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap395afd81-e8') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:14:50 user nova-compute[70975]: DEBUG os_vif [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:1c:ad,bridge_name='br-int',has_traffic_filtering=True,id=395afd81-e898-47ee-a928-eaab584d5b4e,network=Network(0fafa4bf-5d8d-43f9-8cc3-ae6e7c58812c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap395afd81-e8') {{(pid=70975) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 18 16:14:50 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:50 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:14:50 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 18 16:14:50 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:50 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap395afd81-e8, may_exist=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:14:50 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap395afd81-e8, col_values=(('external_ids', {'iface-id': '395afd81-e898-47ee-a928-eaab584d5b4e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fa:1c:ad', 'vm-uuid': '6c592508-0444-4b42-a0b5-e3d8bd97f5ba'}),)) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:14:50 user nova-compute[70975]: 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:50 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:14:50 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:50 user nova-compute[70975]: INFO os_vif [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:1c:ad,bridge_name='br-int',has_traffic_filtering=True,id=395afd81-e898-47ee-a928-eaab584d5b4e,network=Network(0fafa4bf-5d8d-43f9-8cc3-ae6e7c58812c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap395afd81-e8') Apr 18 16:14:50 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] No BDM found with device name vda, not building metadata. {{(pid=70975) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 18 16:14:50 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] No VIF found with MAC fa:16:3e:fa:1c:ad, not building metadata {{(pid=70975) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 18 16:14:51 user nova-compute[70975]: DEBUG nova.network.neutron [req-3a6aed1e-aa97-402c-a4fd-281ac59d7b34 req-f8ce8192-a200-4464-bbf9-05da117ca2bd service nova] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Updated VIF entry in instance network info cache for port 395afd81-e898-47ee-a928-eaab584d5b4e. 
{{(pid=70975) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 18 16:14:51 user nova-compute[70975]: DEBUG nova.network.neutron [req-3a6aed1e-aa97-402c-a4fd-281ac59d7b34 req-f8ce8192-a200-4464-bbf9-05da117ca2bd service nova] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Updating instance_info_cache with network_info: [{"id": "395afd81-e898-47ee-a928-eaab584d5b4e", "address": "fa:16:3e:fa:1c:ad", "network": {"id": "0fafa4bf-5d8d-43f9-8cc3-ae6e7c58812c", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-891115046-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8edf93a24e754e1ea58c0a7fd4f553dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap395afd81-e8", "ovs_interfaceid": "395afd81-e898-47ee-a928-eaab584d5b4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:14:51 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-3a6aed1e-aa97-402c-a4fd-281ac59d7b34 req-f8ce8192-a200-4464-bbf9-05da117ca2bd service nova] Releasing lock "refresh_cache-6c592508-0444-4b42-a0b5-e3d8bd97f5ba" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:14:52 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:52 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:52 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:52 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:52 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:52 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:14:52 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Cleaning up deleted instances {{(pid=70975) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 18 16:14:52 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] There are 0 instances to clean {{(pid=70975) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 18 16:14:52 user nova-compute[70975]: DEBUG 
oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:14:52 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Cleaning up deleted instances with incomplete migration {{(pid=70975) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 18 16:14:52 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:14:52 user nova-compute[70975]: DEBUG nova.compute.manager [req-6bc12126-972a-4fc3-b8c9-e054ea4fa96c req-ba645231-198b-4564-a2ad-41002e154b6d service nova] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Received event network-vif-plugged-395afd81-e898-47ee-a928-eaab584d5b4e {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:14:52 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-6bc12126-972a-4fc3-b8c9-e054ea4fa96c req-ba645231-198b-4564-a2ad-41002e154b6d service nova] Acquiring lock "6c592508-0444-4b42-a0b5-e3d8bd97f5ba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:52 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-6bc12126-972a-4fc3-b8c9-e054ea4fa96c req-ba645231-198b-4564-a2ad-41002e154b6d service nova] Lock "6c592508-0444-4b42-a0b5-e3d8bd97f5ba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:52 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-6bc12126-972a-4fc3-b8c9-e054ea4fa96c req-ba645231-198b-4564-a2ad-41002e154b6d service nova] Lock "6c592508-0444-4b42-a0b5-e3d8bd97f5ba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:52 user nova-compute[70975]: DEBUG nova.compute.manager [req-6bc12126-972a-4fc3-b8c9-e054ea4fa96c req-ba645231-198b-4564-a2ad-41002e154b6d service nova] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] No waiting events found dispatching network-vif-plugged-395afd81-e898-47ee-a928-eaab584d5b4e {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:14:52 user nova-compute[70975]: WARNING nova.compute.manager [req-6bc12126-972a-4fc3-b8c9-e054ea4fa96c req-ba645231-198b-4564-a2ad-41002e154b6d service nova] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Received unexpected event network-vif-plugged-395afd81-e898-47ee-a928-eaab584d5b4e for instance with vm_state building and task_state spawning. 
Apr 18 16:14:53 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:53 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:53 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:54 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Resumed> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:14:54 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] VM Resumed (Lifecycle Event) Apr 18 16:14:54 user nova-compute[70975]: DEBUG nova.compute.manager [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Instance event wait completed in 0 seconds for {{(pid=70975) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 18 16:14:54 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Guest created on hypervisor {{(pid=70975) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 18 16:14:54 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Instance spawned successfully. 
Apr 18 16:14:54 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 18 16:14:54 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:14:54 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:14:54 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Found default for hw_cdrom_bus of ide {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:14:54 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Found default for hw_disk_bus of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:14:54 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Found default for hw_input_bus of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:14:54 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Found default for hw_pointer_model of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:14:54 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Found default for hw_video_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:14:54 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 
6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Found default for hw_vif_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:14:54 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] During sync_power_state the instance has a pending task (spawning). Skip. Apr 18 16:14:54 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Started> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:14:54 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] VM Started (Lifecycle Event) Apr 18 16:14:54 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:14:54 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:14:54 user nova-compute[70975]: DEBUG nova.compute.manager [req-ea92ccec-fae2-4941-86e3-907f123e8968 req-ee9ed0f5-b806-4cc7-9344-8cc20f7f6f29 service nova] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Received event network-vif-plugged-395afd81-e898-47ee-a928-eaab584d5b4e {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:14:54 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-ea92ccec-fae2-4941-86e3-907f123e8968 req-ee9ed0f5-b806-4cc7-9344-8cc20f7f6f29 service nova] Acquiring lock "6c592508-0444-4b42-a0b5-e3d8bd97f5ba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:54 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-ea92ccec-fae2-4941-86e3-907f123e8968 req-ee9ed0f5-b806-4cc7-9344-8cc20f7f6f29 service nova] Lock "6c592508-0444-4b42-a0b5-e3d8bd97f5ba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:54 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-ea92ccec-fae2-4941-86e3-907f123e8968 req-ee9ed0f5-b806-4cc7-9344-8cc20f7f6f29 service nova] Lock "6c592508-0444-4b42-a0b5-e3d8bd97f5ba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:54 user nova-compute[70975]: DEBUG nova.compute.manager [req-ea92ccec-fae2-4941-86e3-907f123e8968 req-ee9ed0f5-b806-4cc7-9344-8cc20f7f6f29 service nova] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] No waiting events found dispatching network-vif-plugged-395afd81-e898-47ee-a928-eaab584d5b4e {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 
16:14:54 user nova-compute[70975]: WARNING nova.compute.manager [req-ea92ccec-fae2-4941-86e3-907f123e8968 req-ee9ed0f5-b806-4cc7-9344-8cc20f7f6f29 service nova] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Received unexpected event network-vif-plugged-395afd81-e898-47ee-a928-eaab584d5b4e for instance with vm_state building and task_state spawning. Apr 18 16:14:54 user nova-compute[70975]: INFO nova.compute.manager [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Took 9.21 seconds to spawn the instance on the hypervisor. Apr 18 16:14:54 user nova-compute[70975]: DEBUG nova.compute.manager [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:14:54 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] During sync_power_state the instance has a pending task (spawning). Skip. Apr 18 16:14:54 user nova-compute[70975]: INFO nova.compute.manager [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Took 10.40 seconds to build instance. Apr 18 16:14:54 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-9575af77-9ead-423f-a441-ea9893811782 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Lock "6c592508-0444-4b42-a0b5-e3d8bd97f5ba" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.516s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:55 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:14:55 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:55 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:56 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:14:56 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:14:56 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task 
ComputeManager._instance_usage_audit {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:14:56 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:14:56 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=70975) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 18 16:14:57 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:57 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:14:57 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Starting heal instance info cache {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 18 16:14:57 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Rebuilding the list of instances to heal {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 18 16:14:57 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "refresh_cache-b9feb20a-78c0-44ac-ab87-3a68a14396aa" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:14:57 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquired lock "refresh_cache-b9feb20a-78c0-44ac-ab87-3a68a14396aa" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:14:57 user nova-compute[70975]: DEBUG nova.network.neutron [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Forcefully refreshing network info cache for instance {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 18 16:14:57 user nova-compute[70975]: DEBUG nova.objects.instance [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lazy-loading 'info_cache' on Instance uuid b9feb20a-78c0-44ac-ab87-3a68a14396aa {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:14:57 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:14:58 user nova-compute[70975]: DEBUG nova.network.neutron [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Updating instance_info_cache with network_info: [{"id": "cba845fa-9bbc-4e86-9fc9-f9458343fcc9", "address": "fa:16:3e:db:e1:db", "network": {"id": "16a8b366-68dd-415f-bae3-c01a7603f384", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1737580312-network", "subnets": [{"cidr": 
"10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "eb907be282bb4348976527807993ee58", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcba845fa-9b", "ovs_interfaceid": "cba845fa-9bbc-4e86-9fc9-f9458343fcc9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:14:58 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Releasing lock "refresh_cache-b9feb20a-78c0-44ac-ab87-3a68a14396aa" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:14:58 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Updated the network info_cache for instance {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 18 16:14:58 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:14:58 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:14:58 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager.update_available_resource {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:14:58 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:58 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:58 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:14:58 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None 
None] Auditing locally available compute resources for user (node: user) {{(pid=70975) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 18 16:14:58 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:58 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/disk --force-share --output=json" returned: 0 in 0.161s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:58 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:59 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/disk --force-share --output=json" returned: 0 in 0.155s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:59 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/993d062c-8462-4534-bcde-9249779d4e90/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:59 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/993d062c-8462-4534-bcde-9249779d4e90/disk --force-share --output=json" returned: 0 in 0.172s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:59 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/993d062c-8462-4534-bcde-9249779d4e90/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:59 user nova-compute[70975]: DEBUG 
oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/993d062c-8462-4534-bcde-9249779d4e90/disk --force-share --output=json" returned: 0 in 0.159s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:59 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d7a293bf-a9bd-424e-ba11-bbed7dfea41c/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:59 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d7a293bf-a9bd-424e-ba11-bbed7dfea41c/disk --force-share --output=json" returned: 0 in 0.191s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:59 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d7a293bf-a9bd-424e-ba11-bbed7dfea41c/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:59 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Acquiring lock "8e1ccfc5-90a7-443f-83e2-c07be27d6c7c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:59 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Lock "8e1ccfc5-90a7-443f-83e2-c07be27d6c7c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:59 user nova-compute[70975]: DEBUG nova.compute.manager [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Starting instance... 
{{(pid=70975) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 18 16:14:59 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d7a293bf-a9bd-424e-ba11-bbed7dfea41c/disk --force-share --output=json" returned: 0 in 0.169s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:59 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/da82d905-1ca1-403d-9598-7561e69b9704/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:14:59 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:14:59 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:14:59 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=70975) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 18 16:14:59 user nova-compute[70975]: INFO nova.compute.claims [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Claim successful on node user Apr 18 16:14:59 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/da82d905-1ca1-403d-9598-7561e69b9704/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:14:59 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/da82d905-1ca1-403d-9598-7561e69b9704/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:15:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/da82d905-1ca1-403d-9598-7561e69b9704/disk --force-share --output=json" returned: 0 in 0.148s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:15:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/aaac3797-349f-4695-bea2-8b0c022a66e0/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:15:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/aaac3797-349f-4695-bea2-8b0c022a66e0/disk --force-share --output=json" returned: 0 in 0.150s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:15:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/aaac3797-349f-4695-bea2-8b0c022a66e0/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:15:00 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) 
update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:15:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/aaac3797-349f-4695-bea2-8b0c022a66e0/disk --force-share --output=json" returned: 0 in 0.156s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:15:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b9feb20a-78c0-44ac-ab87-3a68a14396aa/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:15:00 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:15:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.642s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:15:00 user nova-compute[70975]: DEBUG nova.compute.manager [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Start building networks asynchronously for instance. 
{{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 18 16:15:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b9feb20a-78c0-44ac-ab87-3a68a14396aa/disk --force-share --output=json" returned: 0 in 0.168s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:15:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b9feb20a-78c0-44ac-ab87-3a68a14396aa/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:15:00 user nova-compute[70975]: DEBUG nova.compute.manager [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Allocating IP information in the background. {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 18 16:15:00 user nova-compute[70975]: DEBUG nova.network.neutron [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] allocate_for_instance() {{(pid=70975) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 18 16:15:00 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 18 16:15:00 user nova-compute[70975]: DEBUG nova.compute.manager [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Start building block device mappings for instance. 
{{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 18 16:15:00 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b9feb20a-78c0-44ac-ab87-3a68a14396aa/disk --force-share --output=json" returned: 0 in 0.194s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:15:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6c592508-0444-4b42-a0b5-e3d8bd97f5ba/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:15:00 user nova-compute[70975]: DEBUG nova.compute.manager [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Start spawning the instance on the hypervisor. {{(pid=70975) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 18 16:15:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Creating instance directory {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 18 16:15:00 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Creating image(s) Apr 18 16:15:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Acquiring lock "/opt/stack/data/nova/instances/8e1ccfc5-90a7-443f-83e2-c07be27d6c7c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:15:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Lock "/opt/stack/data/nova/instances/8e1ccfc5-90a7-443f-83e2-c07be27d6c7c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:15:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Lock 
"/opt/stack/data/nova/instances/8e1ccfc5-90a7-443f-83e2-c07be27d6c7c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:15:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:15:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6c592508-0444-4b42-a0b5-e3d8bd97f5ba/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:15:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6c592508-0444-4b42-a0b5-e3d8bd97f5ba/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:15:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.151s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:15:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Acquiring lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:15:01 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:15:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m 
oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:15:01 user nova-compute[70975]: DEBUG nova.policy [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fd46686fd5b845cca0f3d9452a86f4ca', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd82a93c1cb9b4a4da7114874ddf0aa27', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70975) authorize /opt/stack/nova/nova/policy.py:203}} Apr 18 16:15:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6c592508-0444-4b42-a0b5-e3d8bd97f5ba/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:15:01 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.638s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:15:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/8e1ccfc5-90a7-443f-83e2-c07be27d6c7c/disk 1073741824 {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:15:01 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:15:01 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 18 16:15:01 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Hypervisor/Node resource view: name=user free_ram=8096MB free_disk=26.654598236083984GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", 
"numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70975) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 18 16:15:01 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:15:01 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:15:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/8e1ccfc5-90a7-443f-83e2-c07be27d6c7c/disk 1073741824" returned: 0 in 0.057s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:15:01 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.698s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:15:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:15:01 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 
tempest-AttachVolumeTestJSON-313351389-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.139s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:15:01 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Checking if we can resize image /opt/stack/data/nova/instances/8e1ccfc5-90a7-443f-83e2-c07be27d6c7c/disk. size=1073741824 {{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 18 16:15:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8e1ccfc5-90a7-443f-83e2-c07be27d6c7c/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:15:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8e1ccfc5-90a7-443f-83e2-c07be27d6c7c/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:15:01 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Cannot resize image /opt/stack/data/nova/instances/8e1ccfc5-90a7-443f-83e2-c07be27d6c7c/disk to a smaller size. 
{{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 18 16:15:01 user nova-compute[70975]: DEBUG nova.objects.instance [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Lazy-loading 'migration_context' on Instance uuid 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:15:02 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Created local disks {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 18 16:15:02 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Ensure instance console log exists: /opt/stack/data/nova/instances/8e1ccfc5-90a7-443f-83e2-c07be27d6c7c/console.log {{(pid=70975) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 18 16:15:02 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:15:02 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:15:02 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:15:02 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance d7a293bf-a9bd-424e-ba11-bbed7dfea41c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:15:02 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance da82d905-1ca1-403d-9598-7561e69b9704 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:15:02 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 993d062c-8462-4534-bcde-9249779d4e90 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:15:02 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance aaac3797-349f-4695-bea2-8b0c022a66e0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:15:02 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 6c592508-0444-4b42-a0b5-e3d8bd97f5ba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:15:02 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 1b530349-680e-4def-86ef-29c340efa175 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:15:02 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:15:02 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance b9feb20a-78c0-44ac-ab87-3a68a14396aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:15:02 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Total usable vcpus: 12, total allocated vcpus: 8 {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 18 16:15:02 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Final resource view: name=user phys_ram=16023MB used_ram=1536MB phys_disk=40GB used_disk=8GB total_vcpus=12 used_vcpus=8 pci_stats=[] {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 18 16:15:02 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Refreshing inventories for resource provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 18 16:15:02 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Updating ProviderTree inventory for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 18 16:15:02 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Updating inventory in ProviderTree for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 with inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 18 16:15:02 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Refreshing aggregate associations for resource provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9, aggregates: None {{(pid=70975) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 18 16:15:02 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Refreshing trait associations for resource provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9, traits: 
COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE41,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_RESCUE_BFV,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_IDE,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE42 {{(pid=70975) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 18 16:15:02 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:02 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:15:02 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:15:02 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Compute_service record updated for user:user {{(pid=70975) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 18 16:15:02 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.814s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:15:02 user nova-compute[70975]: DEBUG nova.network.neutron [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Successfully created port: 13606f1d-602f-4c77-b90b-32322653e54e {{(pid=70975) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG 
nova.network.neutron [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Successfully updated port: 13606f1d-602f-4c77-b90b-32322653e54e {{(pid=70975) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Acquiring lock "refresh_cache-8e1ccfc5-90a7-443f-83e2-c07be27d6c7c" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Acquired lock "refresh_cache-8e1ccfc5-90a7-443f-83e2-c07be27d6c7c" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG nova.network.neutron [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Building network info cache for instance {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG nova.compute.manager [req-8c63780b-ab90-4826-ad9a-079134bda1fc req-bbba338a-995d-4a17-9c07-bc65bf6bbaf8 service nova] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Received event network-changed-13606f1d-602f-4c77-b90b-32322653e54e {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG nova.compute.manager [req-8c63780b-ab90-4826-ad9a-079134bda1fc req-bbba338a-995d-4a17-9c07-bc65bf6bbaf8 service nova] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Refreshing instance network info cache due to event network-changed-13606f1d-602f-4c77-b90b-32322653e54e. {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-8c63780b-ab90-4826-ad9a-079134bda1fc req-bbba338a-995d-4a17-9c07-bc65bf6bbaf8 service nova] Acquiring lock "refresh_cache-8e1ccfc5-90a7-443f-83e2-c07be27d6c7c" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG nova.network.neutron [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Instance cache missing network info. 
{{(pid=70975) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG nova.network.neutron [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Updating instance_info_cache with network_info: [{"id": "13606f1d-602f-4c77-b90b-32322653e54e", "address": "fa:16:3e:dc:c9:32", "network": {"id": "7f49a051-667b-4e91-80de-f4bbf2d6f09e", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-316224389-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d82a93c1cb9b4a4da7114874ddf0aa27", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap13606f1d-60", "ovs_interfaceid": "13606f1d-602f-4c77-b90b-32322653e54e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Releasing lock "refresh_cache-8e1ccfc5-90a7-443f-83e2-c07be27d6c7c" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG nova.compute.manager [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Instance network_info: |[{"id": "13606f1d-602f-4c77-b90b-32322653e54e", "address": "fa:16:3e:dc:c9:32", "network": {"id": "7f49a051-667b-4e91-80de-f4bbf2d6f09e", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-316224389-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d82a93c1cb9b4a4da7114874ddf0aa27", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap13606f1d-60", "ovs_interfaceid": "13606f1d-602f-4c77-b90b-32322653e54e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-8c63780b-ab90-4826-ad9a-079134bda1fc req-bbba338a-995d-4a17-9c07-bc65bf6bbaf8 service nova] Acquired lock "refresh_cache-8e1ccfc5-90a7-443f-83e2-c07be27d6c7c" {{(pid=70975) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG nova.network.neutron [req-8c63780b-ab90-4826-ad9a-079134bda1fc req-bbba338a-995d-4a17-9c07-bc65bf6bbaf8 service nova] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Refreshing network info cache for port 13606f1d-602f-4c77-b90b-32322653e54e {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Start _get_guest_xml network_info=[{"id": "13606f1d-602f-4c77-b90b-32322653e54e", "address": "fa:16:3e:dc:c9:32", "network": {"id": "7f49a051-667b-4e91-80de-f4bbf2d6f09e", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-316224389-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d82a93c1cb9b4a4da7114874ddf0aa27", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap13606f1d-60", "ovs_interfaceid": "13606f1d-602f-4c77-b90b-32322653e54e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encrypted': False, 'device_type': 'disk', 'encryption_secret_uuid': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'b11a20de-f82a-4158-b53e-0a0c7a1552cb'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 18 16:15:04 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:15:04 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
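The "Start _get_guest_xml" entry above embeds the full network_info for port 13606f1d-602f-4c77-b90b-32322653e54e as inline JSON, which is the blob you end up grepping when correlating this boot with its Neutron port. Below is a minimal, illustrative sketch of pulling the useful fields out of one such VIF entry; the trimmed SAMPLE_VIF dict only reproduces fields that appear in the log line above, and extract_vif_summary is a hypothetical helper for reading these logs, not Nova code.

import json

# Trimmed copy of the network_info entry logged by _get_guest_xml above;
# only the fields used below are reproduced (illustrative sample, not Nova code).
SAMPLE_VIF = {
    "id": "13606f1d-602f-4c77-b90b-32322653e54e",
    "address": "fa:16:3e:dc:c9:32",
    "devname": "tap13606f1d-60",
    "network": {
        "bridge": "br-int",
        "subnets": [
            {"cidr": "10.0.0.0/28",
             "ips": [{"address": "10.0.0.12", "type": "fixed"}]},
        ],
    },
}

def extract_vif_summary(vif):
    """Return port UUID, MAC, tap device, bridge and fixed IPs for one VIF entry."""
    fixed_ips = [
        ip["address"]
        for subnet in vif["network"]["subnets"]
        for ip in subnet["ips"]
        if ip.get("type") == "fixed"
    ]
    return {
        "port_id": vif["id"],
        "mac": vif["address"],
        "devname": vif["devname"],
        "bridge": vif["network"]["bridge"],
        "fixed_ips": fixed_ips,
    }

if __name__ == "__main__":
    # Dumps the summary as JSON; for the sample above the fixed IP is 10.0.0.12.
    print(json.dumps(extract_vif_summary(SAMPLE_VIF), indent=2))

In this trace the same blob reappears a few entries later when os-vif converts it to a VIFOpenVSwitch object, so the summary also serves as a quick cross-check of the tap device name (tap13606f1d-60) that shows up in the ovsdbapp AddPortCommand further down.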
Apr 18 16:15:04 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70975) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-18T16:11:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=), allow threads: True {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Flavor limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Image limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Flavor pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Image pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 18 16:15:04 user 
nova-compute[70975]: DEBUG nova.virt.hardware [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Got 1 possible topologies {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:14:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-1897500495',display_name='tempest-AttachVolumeTestJSON-server-1897500495',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-1897500495',id=8,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLNl9658QV9oAyWsF+PCDoKNB8f2Ysl88swP+0slbqtbCbBmKteLMBpQfjt+1JvV5krJu0v93BvOWlct8ODb6udN1fTuqEBomWxKiKxQ9Jd2pVu6lIa5zb/YgKK7JjSPmQ==',key_name='tempest-keypair-1312731460',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d82a93c1cb9b4a4da7114874ddf0aa27',ramdisk_id='',reservation_id='r-wiop3oaa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-313351389',owner_user_name='tempest-AttachVolumeTestJSON-313351389-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:15:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fd46686fd5b845cca0f3d9452a86f4ca',uuid=8e1ccfc5-90a7-443f-83e2-c07be27d6c7c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "13606f1d-602f-4c77-b90b-32322653e54e", "address": "fa:16:3e:dc:c9:32", "network": {"id": "7f49a051-667b-4e91-80de-f4bbf2d6f09e", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-316224389-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d82a93c1cb9b4a4da7114874ddf0aa27", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap13606f1d-60", "ovs_interfaceid": "13606f1d-602f-4c77-b90b-32322653e54e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70975) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Converting VIF {"id": "13606f1d-602f-4c77-b90b-32322653e54e", "address": "fa:16:3e:dc:c9:32", "network": {"id": "7f49a051-667b-4e91-80de-f4bbf2d6f09e", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-316224389-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, 
"tenant_id": "d82a93c1cb9b4a4da7114874ddf0aa27", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap13606f1d-60", "ovs_interfaceid": "13606f1d-602f-4c77-b90b-32322653e54e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dc:c9:32,bridge_name='br-int',has_traffic_filtering=True,id=13606f1d-602f-4c77-b90b-32322653e54e,network=Network(7f49a051-667b-4e91-80de-f4bbf2d6f09e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13606f1d-60') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG nova.objects.instance [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Lazy-loading 'pci_devices' on Instance uuid 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] End _get_guest_xml xml= Apr 18 16:15:04 user nova-compute[70975]: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c Apr 18 16:15:04 user nova-compute[70975]: instance-00000008 Apr 18 16:15:04 user nova-compute[70975]: 131072 Apr 18 16:15:04 user nova-compute[70975]: 1 Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: tempest-AttachVolumeTestJSON-server-1897500495 Apr 18 16:15:04 user nova-compute[70975]: 2023-04-18 16:15:04 Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: 128 Apr 18 16:15:04 user nova-compute[70975]: 1 Apr 18 16:15:04 user nova-compute[70975]: 0 Apr 18 16:15:04 user nova-compute[70975]: 0 Apr 18 16:15:04 user nova-compute[70975]: 1 Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: tempest-AttachVolumeTestJSON-313351389-project-member Apr 18 16:15:04 user nova-compute[70975]: tempest-AttachVolumeTestJSON-313351389 Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: OpenStack Foundation Apr 18 16:15:04 user nova-compute[70975]: OpenStack Nova Apr 18 16:15:04 user nova-compute[70975]: 0.0.0 Apr 18 16:15:04 user nova-compute[70975]: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c 
Apr 18 16:15:04 user nova-compute[70975]: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c Apr 18 16:15:04 user nova-compute[70975]: Virtual Machine Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: hvm Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Nehalem Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: /dev/urandom Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: Apr 18 16:15:04 user nova-compute[70975]: {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:14:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-1897500495',display_name='tempest-AttachVolumeTestJSON-server-1897500495',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-1897500495',id=8,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLNl9658QV9oAyWsF+PCDoKNB8f2Ysl88swP+0slbqtbCbBmKteLMBpQfjt+1JvV5krJu0v93BvOWlct8ODb6udN1fTuqEBomWxKiKxQ9Jd2pVu6lIa5zb/YgKK7JjSPmQ==',key_name='tempest-keypair-1312731460',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d82a93c1cb9b4a4da7114874ddf0aa27',ramdisk_id='',reservation_id='r-wiop3oaa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-313351389',owner_user_name='tempest-AttachVolumeTestJSON-313351389-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:15:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fd46686fd5b845cca0f3d9452a86f4ca',uuid=8e1ccfc5-90a7-443f-83e2-c07be27d6c7c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "13606f1d-602f-4c77-b90b-32322653e54e", "address": "fa:16:3e:dc:c9:32", "network": {"id": "7f49a051-667b-4e91-80de-f4bbf2d6f09e", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-316224389-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d82a93c1cb9b4a4da7114874ddf0aa27", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap13606f1d-60", "ovs_interfaceid": "13606f1d-602f-4c77-b90b-32322653e54e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Converting VIF {"id": "13606f1d-602f-4c77-b90b-32322653e54e", "address": "fa:16:3e:dc:c9:32", "network": {"id": "7f49a051-667b-4e91-80de-f4bbf2d6f09e", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-316224389-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": 
"d82a93c1cb9b4a4da7114874ddf0aa27", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap13606f1d-60", "ovs_interfaceid": "13606f1d-602f-4c77-b90b-32322653e54e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dc:c9:32,bridge_name='br-int',has_traffic_filtering=True,id=13606f1d-602f-4c77-b90b-32322653e54e,network=Network(7f49a051-667b-4e91-80de-f4bbf2d6f09e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13606f1d-60') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG os_vif [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:dc:c9:32,bridge_name='br-int',has_traffic_filtering=True,id=13606f1d-602f-4c77-b90b-32322653e54e,network=Network(7f49a051-667b-4e91-80de-f4bbf2d6f09e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13606f1d-60') {{(pid=70975) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap13606f1d-60, may_exist=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap13606f1d-60, col_values=(('external_ids', {'iface-id': '13606f1d-602f-4c77-b90b-32322653e54e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:dc:c9:32', 'vm-uuid': '8e1ccfc5-90a7-443f-83e2-c07be27d6c7c'}),)) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 
{{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:04 user nova-compute[70975]: INFO os_vif [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:dc:c9:32,bridge_name='br-int',has_traffic_filtering=True,id=13606f1d-602f-4c77-b90b-32322653e54e,network=Network(7f49a051-667b-4e91-80de-f4bbf2d6f09e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13606f1d-60') Apr 18 16:15:04 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] No BDM found with device name vda, not building metadata. {{(pid=70975) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 18 16:15:04 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] No VIF found with MAC fa:16:3e:dc:c9:32, not building metadata {{(pid=70975) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 18 16:15:05 user nova-compute[70975]: DEBUG nova.network.neutron [req-8c63780b-ab90-4826-ad9a-079134bda1fc req-bbba338a-995d-4a17-9c07-bc65bf6bbaf8 service nova] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Updated VIF entry in instance network info cache for port 13606f1d-602f-4c77-b90b-32322653e54e. 
{{(pid=70975) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 18 16:15:05 user nova-compute[70975]: DEBUG nova.network.neutron [req-8c63780b-ab90-4826-ad9a-079134bda1fc req-bbba338a-995d-4a17-9c07-bc65bf6bbaf8 service nova] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Updating instance_info_cache with network_info: [{"id": "13606f1d-602f-4c77-b90b-32322653e54e", "address": "fa:16:3e:dc:c9:32", "network": {"id": "7f49a051-667b-4e91-80de-f4bbf2d6f09e", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-316224389-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d82a93c1cb9b4a4da7114874ddf0aa27", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap13606f1d-60", "ovs_interfaceid": "13606f1d-602f-4c77-b90b-32322653e54e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:15:05 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-8c63780b-ab90-4826-ad9a-079134bda1fc req-bbba338a-995d-4a17-9c07-bc65bf6bbaf8 service nova] Releasing lock "refresh_cache-8e1ccfc5-90a7-443f-83e2-c07be27d6c7c" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:15:05 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Acquiring lock "8aaa4e97-9439-4760-9e05-8b248b02074f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:15:05 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Lock "8aaa4e97-9439-4760-9e05-8b248b02074f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:15:05 user nova-compute[70975]: DEBUG nova.compute.manager [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Starting instance... 
{{(pid=70975) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 18 16:15:05 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:15:05 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:15:05 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=70975) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 18 16:15:05 user nova-compute[70975]: INFO nova.compute.claims [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Claim successful on node user Apr 18 16:15:05 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:15:05 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:15:05 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.428s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:15:05 user nova-compute[70975]: DEBUG nova.compute.manager [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 
8aaa4e97-9439-4760-9e05-8b248b02074f] Start building networks asynchronously for instance. {{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 18 16:15:05 user nova-compute[70975]: DEBUG nova.compute.manager [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Allocating IP information in the background. {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 18 16:15:05 user nova-compute[70975]: DEBUG nova.network.neutron [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] allocate_for_instance() {{(pid=70975) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 18 16:15:05 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 18 16:15:05 user nova-compute[70975]: DEBUG nova.compute.manager [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Start building block device mappings for instance. {{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 18 16:15:05 user nova-compute[70975]: DEBUG nova.policy [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6a284b1ad50e463894f8d58d38a57d7c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e6fc24a9e1b646a2a08df4f53f712267', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70975) authorize /opt/stack/nova/nova/policy.py:203}} Apr 18 16:15:05 user nova-compute[70975]: DEBUG nova.compute.manager [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Start spawning the instance on the hypervisor. 
{{(pid=70975) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 18 16:15:05 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Creating instance directory {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 18 16:15:05 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Creating image(s) Apr 18 16:15:05 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Acquiring lock "/opt/stack/data/nova/instances/8aaa4e97-9439-4760-9e05-8b248b02074f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:15:05 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Lock "/opt/stack/data/nova/instances/8aaa4e97-9439-4760-9e05-8b248b02074f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:15:05 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Lock "/opt/stack/data/nova/instances/8aaa4e97-9439-4760-9e05-8b248b02074f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:15:05 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:15:06 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:06 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:06 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:06 user nova-compute[70975]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:06 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.178s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:15:06 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Acquiring lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:15:06 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:15:06 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:15:06 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.139s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:15:06 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/8aaa4e97-9439-4760-9e05-8b248b02074f/disk 1073741824 {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:15:06 user nova-compute[70975]: DEBUG nova.compute.manager [req-1d14f48a-e8f0-41f9-9395-5385d1d2217f 
req-82b49ad2-e755-46ad-9131-245ccb89288c service nova] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Received event network-vif-plugged-13606f1d-602f-4c77-b90b-32322653e54e {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:15:06 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-1d14f48a-e8f0-41f9-9395-5385d1d2217f req-82b49ad2-e755-46ad-9131-245ccb89288c service nova] Acquiring lock "8e1ccfc5-90a7-443f-83e2-c07be27d6c7c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:15:06 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-1d14f48a-e8f0-41f9-9395-5385d1d2217f req-82b49ad2-e755-46ad-9131-245ccb89288c service nova] Lock "8e1ccfc5-90a7-443f-83e2-c07be27d6c7c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.002s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:15:06 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-1d14f48a-e8f0-41f9-9395-5385d1d2217f req-82b49ad2-e755-46ad-9131-245ccb89288c service nova] Lock "8e1ccfc5-90a7-443f-83e2-c07be27d6c7c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:15:06 user nova-compute[70975]: DEBUG nova.compute.manager [req-1d14f48a-e8f0-41f9-9395-5385d1d2217f req-82b49ad2-e755-46ad-9131-245ccb89288c service nova] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] No waiting events found dispatching network-vif-plugged-13606f1d-602f-4c77-b90b-32322653e54e {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:15:06 user nova-compute[70975]: WARNING nova.compute.manager [req-1d14f48a-e8f0-41f9-9395-5385d1d2217f req-82b49ad2-e755-46ad-9131-245ccb89288c service nova] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Received unexpected event network-vif-plugged-13606f1d-602f-4c77-b90b-32322653e54e for instance with vm_state building and task_state spawning. 
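
The qemu-img probes in the surrounding entries are not executed directly: oslo.concurrency's processutils wraps them in the prlimit helper visible in the logged command line (--as=1073741824 --cpu=30), so a malformed image cannot exhaust memory or CPU while it is inspected. A minimal sketch of that invocation pattern, illustrative only and not Nova's exact code path:

from oslo_concurrency import processutils
from oslo_utils import units

# Limits matching the logged wrapper:
#   python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- ...
qemu_img_limits = processutils.ProcessLimits(cpu_time=30,
                                             address_space=1 * units.Gi)

# Probe the backing file the same way the surrounding entries do;
# execute() builds the prlimit-wrapped command and returns (stdout, stderr).
out, err = processutils.execute(
    'env', 'LC_ALL=C', 'LANG=C',
    'qemu-img', 'info',
    '/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053',
    '--force-share', '--output=json',
    prlimit=qemu_img_limits)

The qemu-img create call for the instance overlay that follows goes through the same execute() helper, though its logged command line shows it runs without the prlimit wrapper.
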
Apr 18 16:15:06 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/8aaa4e97-9439-4760-9e05-8b248b02074f/disk 1073741824" returned: 0 in 0.085s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:15:06 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.238s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:15:06 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:15:06 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.133s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:15:06 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Checking if we can resize image /opt/stack/data/nova/instances/8aaa4e97-9439-4760-9e05-8b248b02074f/disk. 
size=1073741824 {{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 18 16:15:06 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8aaa4e97-9439-4760-9e05-8b248b02074f/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:15:06 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8aaa4e97-9439-4760-9e05-8b248b02074f/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:15:06 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Cannot resize image /opt/stack/data/nova/instances/8aaa4e97-9439-4760-9e05-8b248b02074f/disk to a smaller size. {{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 18 16:15:06 user nova-compute[70975]: DEBUG nova.objects.instance [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Lazy-loading 'migration_context' on Instance uuid 8aaa4e97-9439-4760-9e05-8b248b02074f {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:15:06 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Created local disks {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 18 16:15:06 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Ensure instance console log exists: /opt/stack/data/nova/instances/8aaa4e97-9439-4760-9e05-8b248b02074f/console.log {{(pid=70975) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 18 16:15:06 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:15:06 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 
tempest-ServerStableDeviceRescueTest-1233154848-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:15:06 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:15:06 user nova-compute[70975]: DEBUG nova.network.neutron [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Successfully created port: 8029e455-c16d-48cd-93e1-cf56c226cc4a {{(pid=70975) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 18 16:15:06 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:06 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:06 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:07 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:07 user nova-compute[70975]: DEBUG nova.network.neutron [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Successfully updated port: 8029e455-c16d-48cd-93e1-cf56c226cc4a {{(pid=70975) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 18 16:15:07 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Acquiring lock "refresh_cache-8aaa4e97-9439-4760-9e05-8b248b02074f" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:15:07 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Acquired lock "refresh_cache-8aaa4e97-9439-4760-9e05-8b248b02074f" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:15:07 user nova-compute[70975]: DEBUG nova.network.neutron [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Building network info cache for instance {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 18 16:15:07 user nova-compute[70975]: DEBUG nova.compute.manager 
[req-e769d872-bd6d-48a1-ae5e-f0493a7bc3e6 req-6b7a0d93-74a2-4e15-abbc-6b023bd42c68 service nova] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Received event network-changed-8029e455-c16d-48cd-93e1-cf56c226cc4a {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:15:07 user nova-compute[70975]: DEBUG nova.compute.manager [req-e769d872-bd6d-48a1-ae5e-f0493a7bc3e6 req-6b7a0d93-74a2-4e15-abbc-6b023bd42c68 service nova] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Refreshing instance network info cache due to event network-changed-8029e455-c16d-48cd-93e1-cf56c226cc4a. {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 18 16:15:07 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-e769d872-bd6d-48a1-ae5e-f0493a7bc3e6 req-6b7a0d93-74a2-4e15-abbc-6b023bd42c68 service nova] Acquiring lock "refresh_cache-8aaa4e97-9439-4760-9e05-8b248b02074f" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:15:07 user nova-compute[70975]: DEBUG nova.network.neutron [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Instance cache missing network info. {{(pid=70975) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 18 16:15:07 user nova-compute[70975]: DEBUG nova.network.neutron [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Updating instance_info_cache with network_info: [{"id": "8029e455-c16d-48cd-93e1-cf56c226cc4a", "address": "fa:16:3e:38:a4:82", "network": {"id": "7692c2b5-931d-4d1d-aae6-384ce4ff5ff0", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-144924554-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e6fc24a9e1b646a2a08df4f53f712267", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8029e455-c1", "ovs_interfaceid": "8029e455-c16d-48cd-93e1-cf56c226cc4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:15:07 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Releasing lock "refresh_cache-8aaa4e97-9439-4760-9e05-8b248b02074f" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:15:07 user nova-compute[70975]: DEBUG nova.compute.manager [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Instance 
network_info: |[{"id": "8029e455-c16d-48cd-93e1-cf56c226cc4a", "address": "fa:16:3e:38:a4:82", "network": {"id": "7692c2b5-931d-4d1d-aae6-384ce4ff5ff0", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-144924554-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e6fc24a9e1b646a2a08df4f53f712267", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8029e455-c1", "ovs_interfaceid": "8029e455-c16d-48cd-93e1-cf56c226cc4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 18 16:15:07 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-e769d872-bd6d-48a1-ae5e-f0493a7bc3e6 req-6b7a0d93-74a2-4e15-abbc-6b023bd42c68 service nova] Acquired lock "refresh_cache-8aaa4e97-9439-4760-9e05-8b248b02074f" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:15:07 user nova-compute[70975]: DEBUG nova.network.neutron [req-e769d872-bd6d-48a1-ae5e-f0493a7bc3e6 req-6b7a0d93-74a2-4e15-abbc-6b023bd42c68 service nova] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Refreshing network info cache for port 8029e455-c16d-48cd-93e1-cf56c226cc4a {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 18 16:15:07 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Start _get_guest_xml network_info=[{"id": "8029e455-c16d-48cd-93e1-cf56c226cc4a", "address": "fa:16:3e:38:a4:82", "network": {"id": "7692c2b5-931d-4d1d-aae6-384ce4ff5ff0", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-144924554-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e6fc24a9e1b646a2a08df4f53f712267", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8029e455-c1", "ovs_interfaceid": "8029e455-c16d-48cd-93e1-cf56c226cc4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encrypted': False, 'device_type': 'disk', 'encryption_secret_uuid': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'b11a20de-f82a-4158-b53e-0a0c7a1552cb'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 18 16:15:07 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:15:07 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:15:07 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70975) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 18 16:15:07 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-18T16:11:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=), allow threads: True {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 18 16:15:07 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Flavor limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 18 16:15:07 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-4bd53016-676b-4566-b265-a7991ed52055 
tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Image limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 18 16:15:07 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Flavor pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 18 16:15:07 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Image pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 18 16:15:07 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 18 16:15:07 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 18 16:15:07 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 18 16:15:07 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Got 1 possible topologies {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 18 16:15:07 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 18 16:15:07 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 18 16:15:07 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:15:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1796838032',display_name='tempest-ServerStableDeviceRescueTest-server-1796838032',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverstabledevicerescuetest-server-1796838032',id=9,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMRdR0cjFHm3mhHSll5gh7yZMFO8YnbHGZrzqn4BUKzi/NqN6epqJPxISmge123Mh6ultuf3msUKM4SPDGPvR5esoWMysquk2JzsFDlVx2V3n3YOLa1rlzu338dq4Z9bHg==',key_name='tempest-keypair-1985020567',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e6fc24a9e1b646a2a08df4f53f712267',ramdisk_id='',reservation_id='r-87uejcsr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-1233154848',owner_user_name='tempest-ServerStableDeviceRescueTest-1233154848-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:15:06Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='6a284b1ad50e463894f8d58d38a57d7c',uuid=8aaa4e97-9439-4760-9e05-8b248b02074f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8029e455-c16d-48cd-93e1-cf56c226cc4a", "address": "fa:16:3e:38:a4:82", "network": {"id": "7692c2b5-931d-4d1d-aae6-384ce4ff5ff0", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-144924554-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e6fc24a9e1b646a2a08df4f53f712267", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8029e455-c1", "ovs_interfaceid": "8029e455-c16d-48cd-93e1-cf56c226cc4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70975) get_config 
/opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 18 16:15:07 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Converting VIF {"id": "8029e455-c16d-48cd-93e1-cf56c226cc4a", "address": "fa:16:3e:38:a4:82", "network": {"id": "7692c2b5-931d-4d1d-aae6-384ce4ff5ff0", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-144924554-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e6fc24a9e1b646a2a08df4f53f712267", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8029e455-c1", "ovs_interfaceid": "8029e455-c16d-48cd-93e1-cf56c226cc4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:15:07 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:a4:82,bridge_name='br-int',has_traffic_filtering=True,id=8029e455-c16d-48cd-93e1-cf56c226cc4a,network=Network(7692c2b5-931d-4d1d-aae6-384ce4ff5ff0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8029e455-c1') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:15:07 user nova-compute[70975]: DEBUG nova.objects.instance [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Lazy-loading 'pci_devices' on Instance uuid 8aaa4e97-9439-4760-9e05-8b248b02074f {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:15:08 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] End _get_guest_xml xml=
[domain XML markup not preserved in this capture; surviving values from the 16:15:08 entries: name instance-00000009, uuid 8aaa4e97-9439-4760-9e05-8b248b02074f, memory 131072, vcpus 1, display name tempest-ServerStableDeviceRescueTest-server-1796838032, creation time 2023-04-18 16:15:07, flavor values 128/1/0/0/1, owner tempest-ServerStableDeviceRescueTest-1233154848-project-member / tempest-ServerStableDeviceRescueTest-1233154848, sysinfo OpenStack Foundation / OpenStack Nova / 0.0.0 / Virtual Machine, os type hvm, CPU model Nehalem, rng backend /dev/urandom] {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 18 16:15:08 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] vif_type=ovs
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:15:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1796838032',display_name='tempest-ServerStableDeviceRescueTest-server-1796838032',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverstabledevicerescuetest-server-1796838032',id=9,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMRdR0cjFHm3mhHSll5gh7yZMFO8YnbHGZrzqn4BUKzi/NqN6epqJPxISmge123Mh6ultuf3msUKM4SPDGPvR5esoWMysquk2JzsFDlVx2V3n3YOLa1rlzu338dq4Z9bHg==',key_name='tempest-keypair-1985020567',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e6fc24a9e1b646a2a08df4f53f712267',ramdisk_id='',reservation_id='r-87uejcsr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-1233154848',owner_user_name='tempest-ServerStableDeviceRescueTest-1233154848-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:15:06Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='6a284b1ad50e463894f8d58d38a57d7c',uuid=8aaa4e97-9439-4760-9e05-8b248b02074f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8029e455-c16d-48cd-93e1-cf56c226cc4a", "address": "fa:16:3e:38:a4:82", "network": {"id": "7692c2b5-931d-4d1d-aae6-384ce4ff5ff0", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-144924554-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e6fc24a9e1b646a2a08df4f53f712267", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8029e455-c1", "ovs_interfaceid": "8029e455-c16d-48cd-93e1-cf56c226cc4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) plug 
/opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 18 16:15:08 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Converting VIF {"id": "8029e455-c16d-48cd-93e1-cf56c226cc4a", "address": "fa:16:3e:38:a4:82", "network": {"id": "7692c2b5-931d-4d1d-aae6-384ce4ff5ff0", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-144924554-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e6fc24a9e1b646a2a08df4f53f712267", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8029e455-c1", "ovs_interfaceid": "8029e455-c16d-48cd-93e1-cf56c226cc4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:15:08 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:a4:82,bridge_name='br-int',has_traffic_filtering=True,id=8029e455-c16d-48cd-93e1-cf56c226cc4a,network=Network(7692c2b5-931d-4d1d-aae6-384ce4ff5ff0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8029e455-c1') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:15:08 user nova-compute[70975]: DEBUG os_vif [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:a4:82,bridge_name='br-int',has_traffic_filtering=True,id=8029e455-c16d-48cd-93e1-cf56c226cc4a,network=Network(7692c2b5-931d-4d1d-aae6-384ce4ff5ff0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8029e455-c1') {{(pid=70975) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 18 16:15:08 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:08 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:15:08 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 18 16:15:08 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:08 
user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8029e455-c1, may_exist=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:15:08 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8029e455-c1, col_values=(('external_ids', {'iface-id': '8029e455-c16d-48cd-93e1-cf56c226cc4a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:38:a4:82', 'vm-uuid': '8aaa4e97-9439-4760-9e05-8b248b02074f'}),)) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:15:08 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:08 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:15:08 user nova-compute[70975]: INFO os_vif [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:a4:82,bridge_name='br-int',has_traffic_filtering=True,id=8029e455-c16d-48cd-93e1-cf56c226cc4a,network=Network(7692c2b5-931d-4d1d-aae6-384ce4ff5ff0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8029e455-c1') Apr 18 16:15:08 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=70975) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 18 16:15:08 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] No VIF found with MAC fa:16:3e:38:a4:82, not building metadata {{(pid=70975) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 18 16:15:08 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Resumed> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:15:08 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] VM Resumed (Lifecycle Event) Apr 18 16:15:08 user nova-compute[70975]: DEBUG nova.compute.manager [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Instance event wait completed in 0 seconds for {{(pid=70975) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 18 16:15:08 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Guest created on hypervisor {{(pid=70975) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 18 16:15:08 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Instance spawned successfully. Apr 18 16:15:08 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 18 16:15:08 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:15:08 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:15:08 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] During sync_power_state the instance has a pending task (spawning). Skip. 
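
The "Successfully plugged vif" sequence a few entries above is carried out by the os-vif ovs plugin through ovsdbapp transactions: an AddBridgeCommand, then an AddPortCommand plus a DbSetCommand against the local switch. A rough, self-contained sketch of the equivalent operations written directly with ovsdbapp, folded into one transaction for brevity; the connection endpoint and timeout are assumptions, the bridge, port and external_ids values mirror the logged commands:

from ovsdbapp.backend.ovs_idl import connection
from ovsdbapp.schema.open_vswitch import impl_idl

# Connect to a local ovsdb-server (endpoint is illustrative).
idl = connection.OvsdbIdl.from_server('tcp:127.0.0.1:6640', 'Open_vSwitch')
api = impl_idl.OvsdbIdl(connection.Connection(idl=idl, timeout=5))

# Ensure br-int exists, attach the tap device, and tag its Interface row
# with the Neutron port id and instance MAC/UUID so OVN can bind the port.
with api.transaction(check_error=True) as txn:
    txn.add(api.add_br('br-int', may_exist=True, datapath_type='system'))
    txn.add(api.add_port('br-int', 'tap8029e455-c1', may_exist=True))
    txn.add(api.db_set(
        'Interface', 'tap8029e455-c1',
        ('external_ids', {
            'iface-id': '8029e455-c16d-48cd-93e1-cf56c226cc4a',
            'iface-status': 'active',
            'attached-mac': 'fa:16:3e:38:a4:82',
            'vm-uuid': '8aaa4e97-9439-4760-9e05-8b248b02074f',
        })))

The "Transaction caused no change" line after the bridge command is what may_exist=True produces when br-int is already present.
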
Apr 18 16:15:08 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Started> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:15:08 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] VM Started (Lifecycle Event) Apr 18 16:15:08 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Found default for hw_cdrom_bus of ide {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:15:08 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Found default for hw_disk_bus of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:15:08 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Found default for hw_input_bus of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:15:08 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Found default for hw_pointer_model of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:15:08 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Found default for hw_video_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:15:08 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Found default for hw_vif_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:15:08 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:15:08 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 
16:15:08 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] During sync_power_state the instance has a pending task (spawning). Skip. Apr 18 16:15:08 user nova-compute[70975]: INFO nova.compute.manager [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Took 7.58 seconds to spawn the instance on the hypervisor. Apr 18 16:15:08 user nova-compute[70975]: DEBUG nova.compute.manager [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:15:08 user nova-compute[70975]: DEBUG nova.network.neutron [req-e769d872-bd6d-48a1-ae5e-f0493a7bc3e6 req-6b7a0d93-74a2-4e15-abbc-6b023bd42c68 service nova] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Updated VIF entry in instance network info cache for port 8029e455-c16d-48cd-93e1-cf56c226cc4a. {{(pid=70975) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 18 16:15:08 user nova-compute[70975]: DEBUG nova.network.neutron [req-e769d872-bd6d-48a1-ae5e-f0493a7bc3e6 req-6b7a0d93-74a2-4e15-abbc-6b023bd42c68 service nova] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Updating instance_info_cache with network_info: [{"id": "8029e455-c16d-48cd-93e1-cf56c226cc4a", "address": "fa:16:3e:38:a4:82", "network": {"id": "7692c2b5-931d-4d1d-aae6-384ce4ff5ff0", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-144924554-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e6fc24a9e1b646a2a08df4f53f712267", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8029e455-c1", "ovs_interfaceid": "8029e455-c16d-48cd-93e1-cf56c226cc4a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:15:08 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-e769d872-bd6d-48a1-ae5e-f0493a7bc3e6 req-6b7a0d93-74a2-4e15-abbc-6b023bd42c68 service nova] Releasing lock "refresh_cache-8aaa4e97-9439-4760-9e05-8b248b02074f" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:15:08 user nova-compute[70975]: INFO nova.compute.manager [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Took 8.69 seconds to build instance. 
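The instance_info_cache payload logged just above is plain JSON once emitted; a small self-contained helper (the function name is mine, not nova's) that pulls the MAC, fixed IPs and floating IPs out of one cached VIF entry:

def summarize_vif(vif):
    """Return (mac, fixed_ips, floating_ips) for one network_info entry."""
    fixed, floating = [], []
    for subnet in vif['network']['subnets']:
        for ip in subnet['ips']:
            fixed.append(ip['address'])
            floating.extend(f['address'] for f in ip.get('floating_ips', []))
    return vif['address'], fixed, floating

# Trimmed-down copy of the entry cached above for port 8029e455-c16d-48cd-93e1-cf56c226cc4a.
entry = {
    'id': '8029e455-c16d-48cd-93e1-cf56c226cc4a',
    'address': 'fa:16:3e:38:a4:82',
    'network': {'subnets': [{'cidr': '10.0.0.0/28',
                             'ips': [{'address': '10.0.0.11', 'floating_ips': []}]}]},
}
print(summarize_vif(entry))  # ('fa:16:3e:38:a4:82', ['10.0.0.11'], [])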
Apr 18 16:15:08 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3d264582-b94d-46c4-b575-32c05d164bdc tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Lock "8e1ccfc5-90a7-443f-83e2-c07be27d6c7c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 8.837s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:15:08 user nova-compute[70975]: DEBUG nova.compute.manager [req-8fa8ac4f-587b-4757-bb53-321acf0f143f req-d803acdf-6840-4530-af28-afa6732cdfa8 service nova] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Received event network-vif-plugged-13606f1d-602f-4c77-b90b-32322653e54e {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:15:08 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-8fa8ac4f-587b-4757-bb53-321acf0f143f req-d803acdf-6840-4530-af28-afa6732cdfa8 service nova] Acquiring lock "8e1ccfc5-90a7-443f-83e2-c07be27d6c7c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:15:08 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-8fa8ac4f-587b-4757-bb53-321acf0f143f req-d803acdf-6840-4530-af28-afa6732cdfa8 service nova] Lock "8e1ccfc5-90a7-443f-83e2-c07be27d6c7c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:15:08 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-8fa8ac4f-587b-4757-bb53-321acf0f143f req-d803acdf-6840-4530-af28-afa6732cdfa8 service nova] Lock "8e1ccfc5-90a7-443f-83e2-c07be27d6c7c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:15:08 user nova-compute[70975]: DEBUG nova.compute.manager [req-8fa8ac4f-587b-4757-bb53-321acf0f143f req-d803acdf-6840-4530-af28-afa6732cdfa8 service nova] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] No waiting events found dispatching network-vif-plugged-13606f1d-602f-4c77-b90b-32322653e54e {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:15:08 user nova-compute[70975]: WARNING nova.compute.manager [req-8fa8ac4f-587b-4757-bb53-321acf0f143f req-d803acdf-6840-4530-af28-afa6732cdfa8 service nova] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Received unexpected event network-vif-plugged-13606f1d-602f-4c77-b90b-32322653e54e for instance with vm_state active and task_state None. 
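Both lock shapes in the records above are oslo.concurrency fair locks: the per-instance lock held for the whole build (released after 8.837s) and the short-lived "-events" lock around the instance event list. A minimal sketch of the two usage patterns; the lock names and function bodies are illustrative only:

from oslo_concurrency import lockutils

# Pattern 1: hold one lock for the entire build, keyed by instance uuid.
@lockutils.synchronized('8e1ccfc5-90a7-443f-83e2-c07be27d6c7c')
def locked_do_build_and_run_instance():
    pass  # build, spawn, update instance state ...

# Pattern 2: a short critical section around the per-instance external-event bookkeeping.
def pop_event():
    with lockutils.lock('8e1ccfc5-90a7-443f-83e2-c07be27d6c7c-events'):
        pass  # pop or clear pending external events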
Apr 18 16:15:09 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:09 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:09 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:09 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:09 user nova-compute[70975]: DEBUG nova.compute.manager [req-b89af778-b73d-43a4-81c9-da793618b697 req-a697563e-2173-4fad-988c-8a3c5d27ba9b service nova] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Received event network-vif-plugged-8029e455-c16d-48cd-93e1-cf56c226cc4a {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:15:09 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-b89af778-b73d-43a4-81c9-da793618b697 req-a697563e-2173-4fad-988c-8a3c5d27ba9b service nova] Acquiring lock "8aaa4e97-9439-4760-9e05-8b248b02074f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:15:09 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-b89af778-b73d-43a4-81c9-da793618b697 req-a697563e-2173-4fad-988c-8a3c5d27ba9b service nova] Lock "8aaa4e97-9439-4760-9e05-8b248b02074f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:15:09 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-b89af778-b73d-43a4-81c9-da793618b697 req-a697563e-2173-4fad-988c-8a3c5d27ba9b service nova] Lock "8aaa4e97-9439-4760-9e05-8b248b02074f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:15:09 user nova-compute[70975]: DEBUG nova.compute.manager [req-b89af778-b73d-43a4-81c9-da793618b697 req-a697563e-2173-4fad-988c-8a3c5d27ba9b service nova] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] No waiting events found dispatching network-vif-plugged-8029e455-c16d-48cd-93e1-cf56c226cc4a {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:15:09 user nova-compute[70975]: WARNING nova.compute.manager [req-b89af778-b73d-43a4-81c9-da793618b697 req-a697563e-2173-4fad-988c-8a3c5d27ba9b service nova] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Received unexpected event network-vif-plugged-8029e455-c16d-48cd-93e1-cf56c226cc4a for instance with vm_state building and task_state spawning. 
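The "No waiting events found" followed by "Received unexpected event" WARNING above is Neutron's network-vif-plugged notification arriving when no waiter had been registered for it, which the compute manager tolerates. A toy, standalone analogue of the prepare-then-wait contract (not nova's implementation; plain threading instead of eventlet):

import threading

class ExternalEvents:
    """Register the waiter *before* triggering the action that will emit the event."""

    def __init__(self):
        self._lock = threading.Lock()
        self._waiters = {}

    def prepare(self, name):
        with self._lock:
            event = self._waiters[name] = threading.Event()
        return event

    def notify(self, name):
        with self._lock:
            event = self._waiters.pop(name, None)
        if event is None:
            print('unexpected event', name)  # analogue of the WARNING records above
        else:
            event.set()

events = ExternalEvents()
waiter = events.prepare('network-vif-plugged-8029e455-c16d-48cd-93e1-cf56c226cc4a')
# ... plug the VIF and start the guest here ...
events.notify('network-vif-plugged-8029e455-c16d-48cd-93e1-cf56c226cc4a')
assert waiter.wait(timeout=1.0)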
Apr 18 16:15:10 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:10 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:10 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:10 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:11 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Resumed> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:15:11 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] VM Resumed (Lifecycle Event) Apr 18 16:15:11 user nova-compute[70975]: DEBUG nova.compute.manager [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Instance event wait completed in 0 seconds for {{(pid=70975) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 18 16:15:11 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Guest created on hypervisor {{(pid=70975) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 18 16:15:11 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Instance spawned successfully. 
Apr 18 16:15:11 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 18 16:15:11 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:15:11 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:15:11 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Found default for hw_cdrom_bus of ide {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:15:11 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Found default for hw_disk_bus of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:15:11 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Found default for hw_input_bus of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:15:11 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Found default for hw_pointer_model of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:15:11 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Found default for hw_video_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:15:11 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 
tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Found default for hw_vif_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:15:11 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] During sync_power_state the instance has a pending task (spawning). Skip. Apr 18 16:15:11 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Started> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:15:11 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] VM Started (Lifecycle Event) Apr 18 16:15:11 user nova-compute[70975]: DEBUG nova.compute.manager [req-b5114c83-f5f1-4ee6-8709-23ecb3330295 req-cc3a6c95-9677-4c86-9b1e-4cbb3dff16dc service nova] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Received event network-vif-plugged-8029e455-c16d-48cd-93e1-cf56c226cc4a {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:15:11 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-b5114c83-f5f1-4ee6-8709-23ecb3330295 req-cc3a6c95-9677-4c86-9b1e-4cbb3dff16dc service nova] Acquiring lock "8aaa4e97-9439-4760-9e05-8b248b02074f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:15:11 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-b5114c83-f5f1-4ee6-8709-23ecb3330295 req-cc3a6c95-9677-4c86-9b1e-4cbb3dff16dc service nova] Lock "8aaa4e97-9439-4760-9e05-8b248b02074f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:15:11 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-b5114c83-f5f1-4ee6-8709-23ecb3330295 req-cc3a6c95-9677-4c86-9b1e-4cbb3dff16dc service nova] Lock "8aaa4e97-9439-4760-9e05-8b248b02074f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:15:11 user nova-compute[70975]: DEBUG nova.compute.manager [req-b5114c83-f5f1-4ee6-8709-23ecb3330295 req-cc3a6c95-9677-4c86-9b1e-4cbb3dff16dc service nova] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] No waiting events found dispatching network-vif-plugged-8029e455-c16d-48cd-93e1-cf56c226cc4a {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:15:11 user nova-compute[70975]: WARNING nova.compute.manager [req-b5114c83-f5f1-4ee6-8709-23ecb3330295 req-cc3a6c95-9677-4c86-9b1e-4cbb3dff16dc service nova] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Received unexpected event network-vif-plugged-8029e455-c16d-48cd-93e1-cf56c226cc4a for instance with vm_state building and task_state spawning. 
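The "Found default for hw_*" records above are the libvirt driver persisting the implicit bus/model choices it made for this guest so they stay stable across its lifetime; they show up as image_hw_* keys in the instance system_metadata (the Instance repr for another guest further down in this log carries the same keys). A tiny illustration of that mapping, using the values discovered above; the variable names are mine:

# Defaults discovered above for this guest (None means no explicit default was chosen).
discovered = {
    'hw_cdrom_bus': 'ide',
    'hw_disk_bus': 'virtio',
    'hw_input_bus': None,
    'hw_pointer_model': None,
    'hw_video_model': 'virtio',
    'hw_vif_model': 'virtio',
}
# Persisted on the instance as image_* system metadata keys.
system_metadata = {'image_' + key: value for key, value in discovered.items()}
print(system_metadata['image_hw_disk_bus'])  # virtio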
Apr 18 16:15:11 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:15:11 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:15:11 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] During sync_power_state the instance has a pending task (spawning). Skip. Apr 18 16:15:11 user nova-compute[70975]: INFO nova.compute.manager [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Took 5.85 seconds to spawn the instance on the hypervisor. Apr 18 16:15:11 user nova-compute[70975]: DEBUG nova.compute.manager [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:15:11 user nova-compute[70975]: INFO nova.compute.manager [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Took 6.65 seconds to build instance. 
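The sync_power_state records above compare the database value (0, NOSTATE) with what the hypervisor reports (1, RUNNING) and skip the sync while the spawn task is still pending. A short libvirt-python sketch of the hypervisor side of that check; the connection URI and the assumption that the domain exists on this host are mine:

import libvirt

conn = libvirt.open('qemu:///system')  # assumption: local system libvirtd, as on this devstack node
dom = conn.lookupByUUIDString('8aaa4e97-9439-4760-9e05-8b248b02074f')
state, _reason = dom.state()
# libvirt.VIR_DOMAIN_RUNNING == 1, i.e. the "VM power_state: 1" reported above.
print(state == libvirt.VIR_DOMAIN_RUNNING)
conn.close()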
Apr 18 16:15:11 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-4bd53016-676b-4566-b265-a7991ed52055 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Lock "8aaa4e97-9439-4760-9e05-8b248b02074f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.757s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:15:12 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:13 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:18 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:22 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:23 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:27 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:28 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:32 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:33 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:37 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:38 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:42 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:43 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:47 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:48 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:48 user nova-compute[70975]: DEBUG 
oslo_concurrency.lockutils [None req-fc318e12-cc56-4100-b455-0975bc1097d0 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Acquiring lock "b9feb20a-78c0-44ac-ab87-3a68a14396aa" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:15:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-fc318e12-cc56-4100-b455-0975bc1097d0 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Lock "b9feb20a-78c0-44ac-ab87-3a68a14396aa" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:15:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-fc318e12-cc56-4100-b455-0975bc1097d0 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Acquiring lock "b9feb20a-78c0-44ac-ab87-3a68a14396aa-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:15:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-fc318e12-cc56-4100-b455-0975bc1097d0 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Lock "b9feb20a-78c0-44ac-ab87-3a68a14396aa-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:15:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-fc318e12-cc56-4100-b455-0975bc1097d0 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Lock "b9feb20a-78c0-44ac-ab87-3a68a14396aa-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:15:48 user nova-compute[70975]: INFO nova.compute.manager [None req-fc318e12-cc56-4100-b455-0975bc1097d0 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Terminating instance Apr 18 16:15:48 user nova-compute[70975]: DEBUG nova.compute.manager [None req-fc318e12-cc56-4100-b455-0975bc1097d0 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Start destroying the instance on the hypervisor. 
{{(pid=70975) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 18 16:15:48 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:48 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:48 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:48 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:48 user nova-compute[70975]: DEBUG nova.compute.manager [req-cde2cd84-2d59-4c89-847f-e152ff07ba5b req-957155fb-e4b3-446f-9084-1e50e217876c service nova] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Received event network-vif-unplugged-cba845fa-9bbc-4e86-9fc9-f9458343fcc9 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:15:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-cde2cd84-2d59-4c89-847f-e152ff07ba5b req-957155fb-e4b3-446f-9084-1e50e217876c service nova] Acquiring lock "b9feb20a-78c0-44ac-ab87-3a68a14396aa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:15:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-cde2cd84-2d59-4c89-847f-e152ff07ba5b req-957155fb-e4b3-446f-9084-1e50e217876c service nova] Lock "b9feb20a-78c0-44ac-ab87-3a68a14396aa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:15:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-cde2cd84-2d59-4c89-847f-e152ff07ba5b req-957155fb-e4b3-446f-9084-1e50e217876c service nova] Lock "b9feb20a-78c0-44ac-ab87-3a68a14396aa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:15:48 user nova-compute[70975]: DEBUG nova.compute.manager [req-cde2cd84-2d59-4c89-847f-e152ff07ba5b req-957155fb-e4b3-446f-9084-1e50e217876c service nova] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] No waiting events found dispatching network-vif-unplugged-cba845fa-9bbc-4e86-9fc9-f9458343fcc9 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:15:48 user nova-compute[70975]: DEBUG nova.compute.manager [req-cde2cd84-2d59-4c89-847f-e152ff07ba5b req-957155fb-e4b3-446f-9084-1e50e217876c service nova] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Received event network-vif-unplugged-cba845fa-9bbc-4e86-9fc9-f9458343fcc9 for instance with task_state deleting. {{(pid=70975) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 18 16:15:48 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Instance destroyed successfully. 
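A few records below, the network deallocation for this deleted instance is wrapped in an oslo.service looping call ("Waiting for function ... _deallocate_network_with_retries to return"). A minimal sketch of that retry pattern using FixedIntervalLoopingCall; nova's real helper adds its own backoff and error handling, and the stand-in function here is purely illustrative:

from oslo_service import loopingcall

attempts = {'n': 0}

def _deallocate_with_retries():
    # Stand-in for the neutron deallocation call; pretend the first two tries fail.
    attempts['n'] += 1
    if attempts['n'] < 3:
        return  # returning without LoopingCallDone means "run again next interval"
    raise loopingcall.LoopingCallDone(retvalue=True)

timer = loopingcall.FixedIntervalLoopingCall(_deallocate_with_retries)
result = timer.start(interval=0.5).wait()  # blocks until LoopingCallDone; returns its retvalue
print(result)  # True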
Apr 18 16:15:48 user nova-compute[70975]: DEBUG nova.objects.instance [None req-fc318e12-cc56-4100-b455-0975bc1097d0 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Lazy-loading 'resources' on Instance uuid b9feb20a-78c0-44ac-ab87-3a68a14396aa {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:15:48 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-fc318e12-cc56-4100-b455-0975bc1097d0 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:13:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-421490992',display_name='tempest-DeleteServersTestJSON-server-421490992',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-deleteserverstestjson-server-421490992',id=1,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-18T16:14:10Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='eb907be282bb4348976527807993ee58',ramdisk_id='',reservation_id='r-7jxbdvnp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-DeleteServersTestJSON-1528617807',owner_user_name='tempest-DeleteServersTestJSON-1528617807-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-18T16:14:11Z,user_data=None,user_id='045e13d387f04d8eb0709154e4114bf5',uuid=b9feb20a-78c0-44ac-ab87-3a68a14396aa,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cba845fa-9bbc-4e86-9fc9-f9458343fcc9", "address": "fa:16:3e:db:e1:db", "network": {"id": "16a8b366-68dd-415f-bae3-c01a7603f384", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1737580312-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "eb907be282bb4348976527807993ee58", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": 
"tapcba845fa-9b", "ovs_interfaceid": "cba845fa-9bbc-4e86-9fc9-f9458343fcc9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 18 16:15:48 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-fc318e12-cc56-4100-b455-0975bc1097d0 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Converting VIF {"id": "cba845fa-9bbc-4e86-9fc9-f9458343fcc9", "address": "fa:16:3e:db:e1:db", "network": {"id": "16a8b366-68dd-415f-bae3-c01a7603f384", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1737580312-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "eb907be282bb4348976527807993ee58", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcba845fa-9b", "ovs_interfaceid": "cba845fa-9bbc-4e86-9fc9-f9458343fcc9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:15:48 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-fc318e12-cc56-4100-b455-0975bc1097d0 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:db:e1:db,bridge_name='br-int',has_traffic_filtering=True,id=cba845fa-9bbc-4e86-9fc9-f9458343fcc9,network=Network(16a8b366-68dd-415f-bae3-c01a7603f384),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcba845fa-9b') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:15:48 user nova-compute[70975]: DEBUG os_vif [None req-fc318e12-cc56-4100-b455-0975bc1097d0 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:db:e1:db,bridge_name='br-int',has_traffic_filtering=True,id=cba845fa-9bbc-4e86-9fc9-f9458343fcc9,network=Network(16a8b366-68dd-415f-bae3-c01a7603f384),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcba845fa-9b') {{(pid=70975) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 18 16:15:48 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:48 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcba845fa-9b, bridge=br-int, if_exists=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:15:48 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:48 user 
nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:15:48 user nova-compute[70975]: INFO os_vif [None req-fc318e12-cc56-4100-b455-0975bc1097d0 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:db:e1:db,bridge_name='br-int',has_traffic_filtering=True,id=cba845fa-9bbc-4e86-9fc9-f9458343fcc9,network=Network(16a8b366-68dd-415f-bae3-c01a7603f384),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcba845fa-9b') Apr 18 16:15:48 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-fc318e12-cc56-4100-b455-0975bc1097d0 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Deleting instance files /opt/stack/data/nova/instances/b9feb20a-78c0-44ac-ab87-3a68a14396aa_del Apr 18 16:15:48 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-fc318e12-cc56-4100-b455-0975bc1097d0 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Deletion of /opt/stack/data/nova/instances/b9feb20a-78c0-44ac-ab87-3a68a14396aa_del complete Apr 18 16:15:48 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-fc318e12-cc56-4100-b455-0975bc1097d0 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Checking UEFI support for host arch (x86_64) {{(pid=70975) supports_uefi /opt/stack/nova/nova/virt/libvirt/host.py:1722}} Apr 18 16:15:48 user nova-compute[70975]: INFO nova.virt.libvirt.host [None req-fc318e12-cc56-4100-b455-0975bc1097d0 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] UEFI support detected Apr 18 16:15:48 user nova-compute[70975]: INFO nova.compute.manager [None req-fc318e12-cc56-4100-b455-0975bc1097d0 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Took 0.67 seconds to destroy the instance on the hypervisor. Apr 18 16:15:48 user nova-compute[70975]: DEBUG oslo.service.loopingcall [None req-fc318e12-cc56-4100-b455-0975bc1097d0 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=70975) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 18 16:15:48 user nova-compute[70975]: DEBUG nova.compute.manager [-] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Deallocating network for instance {{(pid=70975) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 18 16:15:48 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] deallocate_for_instance() {{(pid=70975) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 18 16:15:49 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Updating instance_info_cache with network_info: [] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:15:49 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:49 user nova-compute[70975]: DEBUG nova.compute.manager [req-18a6ad5e-0eeb-4efc-8d33-c8d846d4a110 req-1acffa70-db51-4b22-bb1b-26de858f750e service nova] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Received event network-vif-deleted-cba845fa-9bbc-4e86-9fc9-f9458343fcc9 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:15:49 user nova-compute[70975]: INFO nova.compute.manager [req-18a6ad5e-0eeb-4efc-8d33-c8d846d4a110 req-1acffa70-db51-4b22-bb1b-26de858f750e service nova] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Neutron deleted interface cba845fa-9bbc-4e86-9fc9-f9458343fcc9; detaching it from the instance and deleting it from the info cache Apr 18 16:15:49 user nova-compute[70975]: DEBUG nova.network.neutron [req-18a6ad5e-0eeb-4efc-8d33-c8d846d4a110 req-1acffa70-db51-4b22-bb1b-26de858f750e service nova] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Updating instance_info_cache with network_info: [] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:15:49 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Took 1.03 seconds to deallocate network for instance. Apr 18 16:15:49 user nova-compute[70975]: DEBUG nova.compute.manager [req-18a6ad5e-0eeb-4efc-8d33-c8d846d4a110 req-1acffa70-db51-4b22-bb1b-26de858f750e service nova] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Detach interface failed, port_id=cba845fa-9bbc-4e86-9fc9-f9458343fcc9, reason: Instance b9feb20a-78c0-44ac-ab87-3a68a14396aa could not be found. 
{{(pid=70975) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 18 16:15:49 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-fc318e12-cc56-4100-b455-0975bc1097d0 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:15:49 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-fc318e12-cc56-4100-b455-0975bc1097d0 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:15:50 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-fc318e12-cc56-4100-b455-0975bc1097d0 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:15:50 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-fc318e12-cc56-4100-b455-0975bc1097d0 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:15:50 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-fc318e12-cc56-4100-b455-0975bc1097d0 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.295s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:15:50 user nova-compute[70975]: INFO nova.scheduler.client.report [None req-fc318e12-cc56-4100-b455-0975bc1097d0 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Deleted allocations for instance b9feb20a-78c0-44ac-ab87-3a68a14396aa Apr 18 16:15:50 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-fc318e12-cc56-4100-b455-0975bc1097d0 tempest-DeleteServersTestJSON-1528617807 tempest-DeleteServersTestJSON-1528617807-project-member] Lock "b9feb20a-78c0-44ac-ab87-3a68a14396aa" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.370s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:15:50 user nova-compute[70975]: DEBUG nova.compute.manager [req-c6cbcd95-9822-4c05-ab00-2b90cdb9395f req-f91bb59c-d1b3-4042-9a00-4fa1192937eb service nova] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Received event 
network-vif-plugged-cba845fa-9bbc-4e86-9fc9-f9458343fcc9 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:15:50 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-c6cbcd95-9822-4c05-ab00-2b90cdb9395f req-f91bb59c-d1b3-4042-9a00-4fa1192937eb service nova] Acquiring lock "b9feb20a-78c0-44ac-ab87-3a68a14396aa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:15:50 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-c6cbcd95-9822-4c05-ab00-2b90cdb9395f req-f91bb59c-d1b3-4042-9a00-4fa1192937eb service nova] Lock "b9feb20a-78c0-44ac-ab87-3a68a14396aa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:15:50 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-c6cbcd95-9822-4c05-ab00-2b90cdb9395f req-f91bb59c-d1b3-4042-9a00-4fa1192937eb service nova] Lock "b9feb20a-78c0-44ac-ab87-3a68a14396aa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:15:50 user nova-compute[70975]: DEBUG nova.compute.manager [req-c6cbcd95-9822-4c05-ab00-2b90cdb9395f req-f91bb59c-d1b3-4042-9a00-4fa1192937eb service nova] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] No waiting events found dispatching network-vif-plugged-cba845fa-9bbc-4e86-9fc9-f9458343fcc9 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:15:50 user nova-compute[70975]: WARNING nova.compute.manager [req-c6cbcd95-9822-4c05-ab00-2b90cdb9395f req-f91bb59c-d1b3-4042-9a00-4fa1192937eb service nova] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Received unexpected event network-vif-plugged-cba845fa-9bbc-4e86-9fc9-f9458343fcc9 for instance with vm_state deleted and task_state None. 
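The inventory reported to placement a few records up (VCPU, MEMORY_MB, DISK_GB with their allocation ratios) sets how much the scheduler may pack onto this node: usable capacity per resource class is (total - reserved) * allocation_ratio. Worked out for the values in this log:

inventory = {
    'VCPU':      {'total': 12,    'reserved': 0,   'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 16023, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 40,    'reserved': 0,   'allocation_ratio': 1.0},
}
for resource_class, inv in inventory.items():
    capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
    print(resource_class, capacity)
# VCPU 48.0, MEMORY_MB 15511.0, DISK_GB 40.0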
Apr 18 16:15:52 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:53 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:57 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:58 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:15:59 user nova-compute[70975]: DEBUG nova.compute.manager [req-98bc4293-9f8e-439e-af83-af9edf777672 req-479e8751-b80c-4212-ad12-29bc94be7a38 service nova] [instance: 1b530349-680e-4def-86ef-29c340efa175] Received event network-changed-64d26c20-add4-4a63-bace-6a3678032692 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:15:59 user nova-compute[70975]: DEBUG nova.compute.manager [req-98bc4293-9f8e-439e-af83-af9edf777672 req-479e8751-b80c-4212-ad12-29bc94be7a38 service nova] [instance: 1b530349-680e-4def-86ef-29c340efa175] Refreshing instance network info cache due to event network-changed-64d26c20-add4-4a63-bace-6a3678032692. {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 18 16:15:59 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-98bc4293-9f8e-439e-af83-af9edf777672 req-479e8751-b80c-4212-ad12-29bc94be7a38 service nova] Acquiring lock "refresh_cache-1b530349-680e-4def-86ef-29c340efa175" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:15:59 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-98bc4293-9f8e-439e-af83-af9edf777672 req-479e8751-b80c-4212-ad12-29bc94be7a38 service nova] Acquired lock "refresh_cache-1b530349-680e-4def-86ef-29c340efa175" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:15:59 user nova-compute[70975]: DEBUG nova.network.neutron [req-98bc4293-9f8e-439e-af83-af9edf777672 req-479e8751-b80c-4212-ad12-29bc94be7a38 service nova] [instance: 1b530349-680e-4def-86ef-29c340efa175] Refreshing network info cache for port 64d26c20-add4-4a63-bace-6a3678032692 {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 18 16:15:59 user nova-compute[70975]: DEBUG nova.network.neutron [req-98bc4293-9f8e-439e-af83-af9edf777672 req-479e8751-b80c-4212-ad12-29bc94be7a38 service nova] [instance: 1b530349-680e-4def-86ef-29c340efa175] Updated VIF entry in instance network info cache for port 64d26c20-add4-4a63-bace-6a3678032692. 
{{(pid=70975) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 18 16:15:59 user nova-compute[70975]: DEBUG nova.network.neutron [req-98bc4293-9f8e-439e-af83-af9edf777672 req-479e8751-b80c-4212-ad12-29bc94be7a38 service nova] [instance: 1b530349-680e-4def-86ef-29c340efa175] Updating instance_info_cache with network_info: [{"id": "64d26c20-add4-4a63-bace-6a3678032692", "address": "fa:16:3e:33:ec:46", "network": {"id": "f5beaf4a-eeaf-454b-bde5-dd5e1f15e9dd", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-215585786-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "caa61b19cc4e4cd4bb7d41291c40ef1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap64d26c20-ad", "ovs_interfaceid": "64d26c20-add4-4a63-bace-6a3678032692", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:15:59 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-98bc4293-9f8e-439e-af83-af9edf777672 req-479e8751-b80c-4212-ad12-29bc94be7a38 service nova] Releasing lock "refresh_cache-1b530349-680e-4def-86ef-29c340efa175" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:16:00 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:16:00 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:16:00 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Starting heal instance info cache {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 18 16:16:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "refresh_cache-da82d905-1ca1-403d-9598-7561e69b9704" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:16:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquired lock "refresh_cache-da82d905-1ca1-403d-9598-7561e69b9704" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:16:00 user nova-compute[70975]: DEBUG nova.network.neutron [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Forcefully refreshing network info cache for instance {{(pid=70975) 
_get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 18 16:16:00 user nova-compute[70975]: DEBUG nova.network.neutron [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Updating instance_info_cache with network_info: [{"id": "894e80db-f051-4b32-adc8-e3afa321eb34", "address": "fa:16:3e:ad:ba:71", "network": {"id": "1bff5cf0-81ae-4bf0-88ab-fd844ee78e3f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1814061150-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "261e8ba82d9e4203917afb0241a3b4fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap894e80db-f0", "ovs_interfaceid": "894e80db-f051-4b32-adc8-e3afa321eb34", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:16:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Releasing lock "refresh_cache-da82d905-1ca1-403d-9598-7561e69b9704" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:16:00 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Updated the network info_cache for instance {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 18 16:16:00 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:16:00 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:16:00 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:16:00 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:16:00 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:16:00 user nova-compute[70975]: 
DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:16:00 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=70975) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 18 16:16:00 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager.update_available_resource {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:16:01 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:16:01 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:16:01 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:16:01 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Auditing locally available compute resources for user (node: user) {{(pid=70975) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 18 16:16:01 user nova-compute[70975]: INFO nova.compute.manager [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Rescuing Apr 18 16:16:01 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Acquiring lock "refresh_cache-d7a293bf-a9bd-424e-ba11-bbed7dfea41c" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:16:01 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Acquired lock "refresh_cache-d7a293bf-a9bd-424e-ba11-bbed7dfea41c" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:16:01 user nova-compute[70975]: DEBUG nova.network.neutron [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Building network info 
cache for instance {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 18 16:16:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8e1ccfc5-90a7-443f-83e2-c07be27d6c7c/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:16:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8e1ccfc5-90a7-443f-83e2-c07be27d6c7c/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:16:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8e1ccfc5-90a7-443f-83e2-c07be27d6c7c/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:16:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8e1ccfc5-90a7-443f-83e2-c07be27d6c7c/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:16:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/993d062c-8462-4534-bcde-9249779d4e90/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:16:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/993d062c-8462-4534-bcde-9249779d4e90/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:16:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/993d062c-8462-4534-bcde-9249779d4e90/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:16:01 user nova-compute[70975]: DEBUG nova.network.neutron [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b 
tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Updating instance_info_cache with network_info: [{"id": "e5d69d5c-1a5c-4300-ab15-e73f78388f0e", "address": "fa:16:3e:92:2d:7f", "network": {"id": "1bff5cf0-81ae-4bf0-88ab-fd844ee78e3f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1814061150-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "261e8ba82d9e4203917afb0241a3b4fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape5d69d5c-1a", "ovs_interfaceid": "e5d69d5c-1a5c-4300-ab15-e73f78388f0e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:16:01 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Releasing lock "refresh_cache-d7a293bf-a9bd-424e-ba11-bbed7dfea41c" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:16:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/993d062c-8462-4534-bcde-9249779d4e90/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:16:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d7a293bf-a9bd-424e-ba11-bbed7dfea41c/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:16:01 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d7a293bf-a9bd-424e-ba11-bbed7dfea41c/disk --force-share --output=json" returned: 0 in 0.148s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:16:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 
-- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d7a293bf-a9bd-424e-ba11-bbed7dfea41c/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:16:01 user nova-compute[70975]: DEBUG nova.compute.manager [req-9792d29c-5bb5-406d-89e4-3a6799272601 req-c1d50670-d22d-460c-af0e-b1880e017437 service nova] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Received event network-vif-unplugged-e5d69d5c-1a5c-4300-ab15-e73f78388f0e {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:16:01 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-9792d29c-5bb5-406d-89e4-3a6799272601 req-c1d50670-d22d-460c-af0e-b1880e017437 service nova] Acquiring lock "d7a293bf-a9bd-424e-ba11-bbed7dfea41c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:16:01 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-9792d29c-5bb5-406d-89e4-3a6799272601 req-c1d50670-d22d-460c-af0e-b1880e017437 service nova] Lock "d7a293bf-a9bd-424e-ba11-bbed7dfea41c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:16:01 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-9792d29c-5bb5-406d-89e4-3a6799272601 req-c1d50670-d22d-460c-af0e-b1880e017437 service nova] Lock "d7a293bf-a9bd-424e-ba11-bbed7dfea41c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:16:01 user nova-compute[70975]: DEBUG nova.compute.manager [req-9792d29c-5bb5-406d-89e4-3a6799272601 req-c1d50670-d22d-460c-af0e-b1880e017437 service nova] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] No waiting events found dispatching network-vif-unplugged-e5d69d5c-1a5c-4300-ab15-e73f78388f0e {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:16:01 user nova-compute[70975]: WARNING nova.compute.manager [req-9792d29c-5bb5-406d-89e4-3a6799272601 req-c1d50670-d22d-460c-af0e-b1880e017437 service nova] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Received unexpected event network-vif-unplugged-e5d69d5c-1a5c-4300-ab15-e73f78388f0e for instance with vm_state active and task_state rescuing. 
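The "d7a293bf-a9bd-424e-ba11-bbed7dfea41c-events" acquire/release triplets above come from oslo.concurrency's named in-process locks, which the compute manager uses to serialize external event dispatch per instance before logging either a matched waiter or "No waiting events found". A minimal sketch of that locking pattern follows; lockutils.lock is the real API, but the event store and helper name below are illustrative assumptions, not Nova's actual code.

    from oslo_concurrency import lockutils

    _pending_events = {}  # assumed store: instance uuid -> set of awaited event names

    def pop_instance_event(instance_uuid, event_name):
        # One in-process lock per instance, named "<uuid>-events" as in the log,
        # guards the lookup so concurrent Neutron notifications cannot race.
        with lockutils.lock('%s-events' % instance_uuid):
            waiting = _pending_events.get(instance_uuid, set())
            if event_name in waiting:
                waiting.discard(event_name)
                return event_name
            # Corresponds to "No waiting events found dispatching ..." and the
            # WARNING about an unexpected event for the current task_state.
            return None

When nothing is waiting, the event is only logged as unexpected rather than treated as an error, which is consistent with the rescue continuing in the records that follow.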
Apr 18 16:16:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d7a293bf-a9bd-424e-ba11-bbed7dfea41c/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/da82d905-1ca1-403d-9598-7561e69b9704/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/da82d905-1ca1-403d-9598-7561e69b9704/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/da82d905-1ca1-403d-9598-7561e69b9704/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:16:02 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Instance destroyed successfully. 
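The repeated "Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info ... --force-share --output=json" records above are the resource-audit task sizing each instance disk; the prlimit wrapper is what oslo.concurrency emits when a ProcessLimits object is passed to processutils.execute. A rough, standalone reproduction of one probe (the path is copied from the log; the helper name is an illustrative assumption):

    import json
    from oslo_concurrency import processutils

    def probe_disk(path):
        # Same limits as the logged command: --as=1073741824 (1 GiB address space)
        # and --cpu=30 seconds; locale pinned to C exactly as in the log.
        limits = processutils.ProcessLimits(address_space=1073741824, cpu_time=30)
        out, _err = processutils.execute(
            'env', 'LC_ALL=C', 'LANG=C',
            'qemu-img', 'info', path, '--force-share', '--output=json',
            prlimit=limits)
        return json.loads(out)

    # probe_disk('/opt/stack/data/nova/instances/da82d905-1ca1-403d-9598-7561e69b9704/disk')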
Apr 18 16:16:02 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Attempting rescue Apr 18 16:16:02 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} {{(pid=70975) rescue /opt/stack/nova/nova/virt/libvirt/driver.py:4289}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Instance directory exists: not creating {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4694}} Apr 18 16:16:02 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Creating image(s) Apr 18 16:16:02 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Acquiring lock "/opt/stack/data/nova/instances/d7a293bf-a9bd-424e-ba11-bbed7dfea41c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lock "/opt/stack/data/nova/instances/d7a293bf-a9bd-424e-ba11-bbed7dfea41c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lock "/opt/stack/data/nova/instances/d7a293bf-a9bd-424e-ba11-bbed7dfea41c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.007s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG nova.objects.instance [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lazy-loading 'trusted_certs' on Instance uuid d7a293bf-a9bd-424e-ba11-bbed7dfea41c {{(pid=70975) obj_load_attr 
/opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/da82d905-1ca1-403d-9598-7561e69b9704/disk --force-share --output=json" returned: 0 in 0.148s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8aaa4e97-9439-4760-9e05-8b248b02074f/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Acquiring lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8aaa4e97-9439-4760-9e05-8b248b02074f/disk --force-share --output=json" returned: 0 in 0.128s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8aaa4e97-9439-4760-9e05-8b248b02074f/disk --force-share 
--output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.126s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/d7a293bf-a9bd-424e-ba11-bbed7dfea41c/disk.rescue {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/d7a293bf-a9bd-424e-ba11-bbed7dfea41c/disk.rescue" returned: 0 in 0.048s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.180s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG nova.objects.instance [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lazy-loading 'migration_context' on Instance uuid d7a293bf-a9bd-424e-ba11-bbed7dfea41c {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Created local disks {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Start _get_guest_xml network_info=[{"id": "e5d69d5c-1a5c-4300-ab15-e73f78388f0e", "address": "fa:16:3e:92:2d:7f", "network": 
{"id": "1bff5cf0-81ae-4bf0-88ab-fd844ee78e3f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1814061150-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-1814061150-network", "vif_mac": "fa:16:3e:92:2d:7f"}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "261e8ba82d9e4203917afb0241a3b4fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape5d69d5c-1a", "ovs_interfaceid": "e5d69d5c-1a5c-4300-ab15-e73f78388f0e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=) rescue={'image_id': 'b11a20de-f82a-4158-b53e-0a0c7a1552cb', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG nova.objects.instance [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lazy-loading 'resources' on Instance uuid d7a293bf-a9bd-424e-ba11-bbed7dfea41c {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG nova.objects.instance [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lazy-loading 'numa_topology' on Instance uuid d7a293bf-a9bd-424e-ba11-bbed7dfea41c {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8aaa4e97-9439-4760-9e05-8b248b02074f/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/disk --force-share --output=json {{(pid=70975) 
execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:16:02 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:16:02 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:16:02 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70975) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-18T16:11:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=), allow threads: True {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Flavor limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Image limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Flavor pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Image pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:387}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Got 1 possible topologies {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG nova.objects.instance [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lazy-loading 'vcpu_model' on Instance uuid d7a293bf-a9bd-424e-ba11-bbed7dfea41c {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:14:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1351031695',display_name='tempest-ServerRescueNegativeTestJSON-server-1351031695',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-1351031695',id=4,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-18T16:14:23Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='261e8ba82d9e4203917afb0241a3b4fc',ramdisk_id='',reservation_id='r-aw8jyd7h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerRescueNegativeTestJSON-1586888284',owner_user_name='tempest-ServerRescueNegativeTestJSON-1586888284-project-member'},tags=,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:14:24Z,user_data=None,user_id='a8a3f45f9c6c431781fb582b8da22b0b',uuid=d7a293bf-a9bd-424e-ba11-bbed7dfea41c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e5d69d5c-1a5c-4300-ab15-e73f78388f0e", "address": "fa:16:3e:92:2d:7f", "network": {"id": "1bff5cf0-81ae-4bf0-88ab-fd844ee78e3f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1814061150-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-1814061150-network", "vif_mac": "fa:16:3e:92:2d:7f"}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "261e8ba82d9e4203917afb0241a3b4fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape5d69d5c-1a", "ovs_interfaceid": "e5d69d5c-1a5c-4300-ab15-e73f78388f0e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70975) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None 
req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Converting VIF {"id": "e5d69d5c-1a5c-4300-ab15-e73f78388f0e", "address": "fa:16:3e:92:2d:7f", "network": {"id": "1bff5cf0-81ae-4bf0-88ab-fd844ee78e3f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1814061150-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-1814061150-network", "vif_mac": "fa:16:3e:92:2d:7f"}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "261e8ba82d9e4203917afb0241a3b4fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape5d69d5c-1a", "ovs_interfaceid": "e5d69d5c-1a5c-4300-ab15-e73f78388f0e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:92:2d:7f,bridge_name='br-int',has_traffic_filtering=True,id=e5d69d5c-1a5c-4300-ab15-e73f78388f0e,network=Network(1bff5cf0-81ae-4bf0-88ab-fd844ee78e3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5d69d5c-1a') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG nova.objects.instance [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lazy-loading 'pci_devices' on Instance uuid d7a293bf-a9bd-424e-ba11-bbed7dfea41c {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] 
[instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] End _get_guest_xml xml= [domain XML omitted: the angle-bracketed markup was lost when this log was captured; surviving element text: instance-00000004, d7a293bf-a9bd-424e-ba11-bbed7dfea41c, 131072, 1, tempest-ServerRescueNegativeTestJSON-server-1351031695, 2023-04-18 16:16:02, 128, 1, 0, 0, 1, tempest-ServerRescueNegativeTestJSON-1586888284-project-member, tempest-ServerRescueNegativeTestJSON-1586888284, OpenStack Foundation, OpenStack Nova, 0.0.0, Virtual Machine, hvm, Nehalem, /dev/urandom]
{{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 18 16:16:02 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Instance destroyed successfully. Apr 18 16:16:02 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] No BDM found with device name vda, not building metadata. {{(pid=70975) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] No BDM found with device name vdb, not building metadata. {{(pid=70975) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] No VIF found with MAC fa:16:3e:92:2d:7f, not building metadata {{(pid=70975) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:16:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6c592508-0444-4b42-a0b5-e3d8bd97f5ba/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:16:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6c592508-0444-4b42-a0b5-e3d8bd97f5ba/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:16:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None
req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6c592508-0444-4b42-a0b5-e3d8bd97f5ba/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:16:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6c592508-0444-4b42-a0b5-e3d8bd97f5ba/disk --force-share --output=json" returned: 0 in 0.153s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:16:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/aaac3797-349f-4695-bea2-8b0c022a66e0/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:16:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/aaac3797-349f-4695-bea2-8b0c022a66e0/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:16:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/aaac3797-349f-4695-bea2-8b0c022a66e0/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:16:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/aaac3797-349f-4695-bea2-8b0c022a66e0/disk --force-share --output=json" returned: 0 in 0.128s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:16:06 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:06 user nova-compute[70975]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:16:06 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] VM Stopped (Lifecycle Event) Apr 18 16:16:06 user nova-compute[70975]: DEBUG nova.compute.manager [req-a17dfb30-40ea-4dba-96a7-c9a25ce9601a req-4422a4a9-0dff-40e2-9bad-23f2add07de5 service nova] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Received event network-vif-plugged-e5d69d5c-1a5c-4300-ab15-e73f78388f0e 
{{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:16:06 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-a17dfb30-40ea-4dba-96a7-c9a25ce9601a req-4422a4a9-0dff-40e2-9bad-23f2add07de5 service nova] Acquiring lock "d7a293bf-a9bd-424e-ba11-bbed7dfea41c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:16:06 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-a17dfb30-40ea-4dba-96a7-c9a25ce9601a req-4422a4a9-0dff-40e2-9bad-23f2add07de5 service nova] Lock "d7a293bf-a9bd-424e-ba11-bbed7dfea41c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:16:06 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-a17dfb30-40ea-4dba-96a7-c9a25ce9601a req-4422a4a9-0dff-40e2-9bad-23f2add07de5 service nova] Lock "d7a293bf-a9bd-424e-ba11-bbed7dfea41c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:16:06 user nova-compute[70975]: DEBUG nova.compute.manager [req-a17dfb30-40ea-4dba-96a7-c9a25ce9601a req-4422a4a9-0dff-40e2-9bad-23f2add07de5 service nova] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] No waiting events found dispatching network-vif-plugged-e5d69d5c-1a5c-4300-ab15-e73f78388f0e {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:16:06 user nova-compute[70975]: WARNING nova.compute.manager [req-a17dfb30-40ea-4dba-96a7-c9a25ce9601a req-4422a4a9-0dff-40e2-9bad-23f2add07de5 service nova] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Received unexpected event network-vif-plugged-e5d69d5c-1a5c-4300-ab15-e73f78388f0e for instance with vm_state active and task_state rescuing. 
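The qemu-img probes logged above are issued through oslo.concurrency, which wraps the command in the prlimit helper visible on the logged command line (--as=1073741824 --cpu=30). A minimal sketch of that call pattern, assuming the oslo.concurrency ProcessLimits/prlimit interface and reusing one disk path from the log; illustrative only, not Nova's actual code:

import json
from oslo_concurrency import processutils

# Disk path copied from the log; the limits mirror --as=1073741824 --cpu=30.
disk = "/opt/stack/data/nova/instances/6c592508-0444-4b42-a0b5-e3d8bd97f5ba/disk"
out, _err = processutils.execute(
    "env", "LC_ALL=C", "LANG=C",
    "qemu-img", "info", disk, "--force-share", "--output=json",
    prlimit=processutils.ProcessLimits(address_space=1073741824, cpu_time=30),
)
info = json.loads(out)
print(info.get("format"), info.get("virtual-size"))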
Apr 18 16:16:06 user nova-compute[70975]: DEBUG nova.compute.manager [req-2ef51212-ea4d-4f04-b95f-29b8f87a4aaf req-87e82064-9267-405f-977b-ac553167a6ca service nova] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Received event network-vif-plugged-e5d69d5c-1a5c-4300-ab15-e73f78388f0e {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:16:06 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-2ef51212-ea4d-4f04-b95f-29b8f87a4aaf req-87e82064-9267-405f-977b-ac553167a6ca service nova] Acquiring lock "d7a293bf-a9bd-424e-ba11-bbed7dfea41c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:16:06 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-2ef51212-ea4d-4f04-b95f-29b8f87a4aaf req-87e82064-9267-405f-977b-ac553167a6ca service nova] Lock "d7a293bf-a9bd-424e-ba11-bbed7dfea41c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:16:06 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-2ef51212-ea4d-4f04-b95f-29b8f87a4aaf req-87e82064-9267-405f-977b-ac553167a6ca service nova] Lock "d7a293bf-a9bd-424e-ba11-bbed7dfea41c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:16:06 user nova-compute[70975]: DEBUG nova.compute.manager [req-2ef51212-ea4d-4f04-b95f-29b8f87a4aaf req-87e82064-9267-405f-977b-ac553167a6ca service nova] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] No waiting events found dispatching network-vif-plugged-e5d69d5c-1a5c-4300-ab15-e73f78388f0e {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:16:06 user nova-compute[70975]: WARNING nova.compute.manager [req-2ef51212-ea4d-4f04-b95f-29b8f87a4aaf req-87e82064-9267-405f-977b-ac553167a6ca service nova] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Received unexpected event network-vif-plugged-e5d69d5c-1a5c-4300-ab15-e73f78388f0e for instance with vm_state active and task_state rescuing. 
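The network-vif-plugged deliveries being dispatched in these entries reach nova-compute through Nova's os-server-external-events API. A hedged sketch of such a delivery; the endpoint URL and token are placeholders, not values from this log:

import requests

NOVA_API = "http://controller/compute/v2.1"   # assumption: deployment-specific endpoint
TOKEN = "..."                                 # assumption: valid Keystone token

payload = {
    "events": [{
        "name": "network-vif-plugged",
        "server_uuid": "d7a293bf-a9bd-424e-ba11-bbed7dfea41c",  # instance uuid from the log
        "tag": "e5d69d5c-1a5c-4300-ab15-e73f78388f0e",          # port id from the log
        "status": "completed",
    }]
}
resp = requests.post(f"{NOVA_API}/os-server-external-events",
                     json=payload, headers={"X-Auth-Token": TOKEN}, timeout=10)
resp.raise_for_status()
print(resp.status_code, resp.json())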
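The Acquiring/acquired/released lines around each delivery come from oslo.concurrency named locks keyed on "<instance-uuid>-events". A simplified sketch of that serialization and of the "No waiting events found" outcome; an illustration, not Nova's actual implementation:

from oslo_concurrency import lockutils

def pop_instance_event(pending, instance_uuid, event_name):
    # lockutils.lock() is the context manager behind the
    # "Acquiring lock ... / Lock ... acquired / released" DEBUG lines above.
    with lockutils.lock(f"{instance_uuid}-events"):
        waiters = pending.get(instance_uuid, {})
        # Returning None corresponds to "No waiting events found dispatching ...",
        # after which the caller logs the "Received unexpected event" warning.
        return waiters.pop(event_name, None)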
Apr 18 16:16:06 user nova-compute[70975]: DEBUG nova.virt.libvirt.host [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Removed pending event for d7a293bf-a9bd-424e-ba11-bbed7dfea41c due to event {{(pid=70975) _event_emit_delayed /opt/stack/nova/nova/virt/libvirt/host.py:438}} Apr 18 16:16:06 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Resumed> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:16:06 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] VM Resumed (Lifecycle Event) Apr 18 16:16:06 user nova-compute[70975]: DEBUG nova.compute.manager [None req-38b8facd-cf6d-492a-b045-6d36bd9abf1b tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:16:06 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:16:06 user nova-compute[70975]: DEBUG nova.compute.manager [None req-d05716ae-0577-4a67-9884-695d9efc6e12 None None] [instance: b9feb20a-78c0-44ac-ab87-3a68a14396aa] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:16:06 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:16:06 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:16:06 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
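The repeated "multiple sockets per NUMA node" warning reflects the host topology libvirt reports. A rough way to reproduce the underlying check from the capabilities XML, assuming the libvirt-python bindings and the local qemu:///system URI:

import xml.etree.ElementTree as ET
import libvirt

conn = libvirt.open("qemu:///system")          # assumption: local libvirtd
caps = ET.fromstring(conn.getCapabilities())   # host capabilities XML

for cell in caps.findall(".//host/topology/cells/cell"):
    sockets = {cpu.get("socket_id") for cpu in cell.findall("./cpus/cpu")}
    if len(sockets) > 1:
        print(f"NUMA cell {cell.get('id')}: {len(sockets)} sockets "
              "-> 'socket' PCI NUMA affinity unsupported")
conn.close()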
Apr 18 16:16:06 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Hypervisor/Node resource view: name=user free_ram=8229MB free_disk=26.582340240478516GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", 
"numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70975) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 18 16:16:06 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:16:06 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:16:06 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:16:06 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] During sync_power_state the instance has a pending task (rescuing). Skip. 
Apr 18 16:16:06 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Started> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:16:06 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] VM Started (Lifecycle Event) Apr 18 16:16:06 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:16:06 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:16:06 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance da82d905-1ca1-403d-9598-7561e69b9704 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:16:06 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 1b530349-680e-4def-86ef-29c340efa175 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:16:06 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance d7a293bf-a9bd-424e-ba11-bbed7dfea41c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:16:06 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 993d062c-8462-4534-bcde-9249779d4e90 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:16:06 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance aaac3797-349f-4695-bea2-8b0c022a66e0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:16:06 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 6c592508-0444-4b42-a0b5-e3d8bd97f5ba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:16:06 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:16:06 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 8aaa4e97-9439-4760-9e05-8b248b02074f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:16:06 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Total usable vcpus: 12, total allocated vcpus: 8 {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 18 16:16:06 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Final resource view: name=user phys_ram=16023MB used_ram=1536MB phys_disk=40GB used_disk=8GB total_vcpus=12 used_vcpus=8 pci_stats=[] {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 18 16:16:07 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:16:07 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:16:07 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Compute_service record updated for user:user {{(pid=70975) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 18 16:16:07 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.395s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:16:07 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:08 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running 
periodic task ComputeManager._sync_scheduler_instance_info {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:16:11 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:14 user nova-compute[70975]: DEBUG nova.compute.manager [req-b32e032e-096b-41dd-b8ec-586b105fbb19 req-ded49725-43e8-4029-816b-42bd916c823c service nova] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Received event network-changed-fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:16:14 user nova-compute[70975]: DEBUG nova.compute.manager [req-b32e032e-096b-41dd-b8ec-586b105fbb19 req-ded49725-43e8-4029-816b-42bd916c823c service nova] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Refreshing instance network info cache due to event network-changed-fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3. {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 18 16:16:14 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-b32e032e-096b-41dd-b8ec-586b105fbb19 req-ded49725-43e8-4029-816b-42bd916c823c service nova] Acquiring lock "refresh_cache-993d062c-8462-4534-bcde-9249779d4e90" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:16:14 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-b32e032e-096b-41dd-b8ec-586b105fbb19 req-ded49725-43e8-4029-816b-42bd916c823c service nova] Acquired lock "refresh_cache-993d062c-8462-4534-bcde-9249779d4e90" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:16:14 user nova-compute[70975]: DEBUG nova.network.neutron [req-b32e032e-096b-41dd-b8ec-586b105fbb19 req-ded49725-43e8-4029-816b-42bd916c823c service nova] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Refreshing network info cache for port fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3 {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 18 16:16:14 user nova-compute[70975]: DEBUG nova.network.neutron [req-b32e032e-096b-41dd-b8ec-586b105fbb19 req-ded49725-43e8-4029-816b-42bd916c823c service nova] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Updated VIF entry in instance network info cache for port fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3. 
{{(pid=70975) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 18 16:16:14 user nova-compute[70975]: DEBUG nova.network.neutron [req-b32e032e-096b-41dd-b8ec-586b105fbb19 req-ded49725-43e8-4029-816b-42bd916c823c service nova] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Updating instance_info_cache with network_info: [{"id": "fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3", "address": "fa:16:3e:d2:bc:43", "network": {"id": "02aca424-2923-404b-9c66-76bec89f82b7", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1255258227-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.61", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6b4e8d8797be4c0e91b1401538f2eba8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe3d4b7c-e1", "ovs_interfaceid": "fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:16:14 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-b32e032e-096b-41dd-b8ec-586b105fbb19 req-ded49725-43e8-4029-816b-42bd916c823c service nova] Releasing lock "refresh_cache-993d062c-8462-4534-bcde-9249779d4e90" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:16:16 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-8b90e197-0f0e-4b95-b9d7-b1d316aa1db1 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Acquiring lock "993d062c-8462-4534-bcde-9249779d4e90" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:16:16 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-8b90e197-0f0e-4b95-b9d7-b1d316aa1db1 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "993d062c-8462-4534-bcde-9249779d4e90" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:16:16 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-8b90e197-0f0e-4b95-b9d7-b1d316aa1db1 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Acquiring lock "993d062c-8462-4534-bcde-9249779d4e90-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:16:16 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-8b90e197-0f0e-4b95-b9d7-b1d316aa1db1 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "993d062c-8462-4534-bcde-9249779d4e90-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:16:16 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-8b90e197-0f0e-4b95-b9d7-b1d316aa1db1 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "993d062c-8462-4534-bcde-9249779d4e90-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:16:16 user nova-compute[70975]: INFO nova.compute.manager [None req-8b90e197-0f0e-4b95-b9d7-b1d316aa1db1 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Terminating instance Apr 18 16:16:16 user nova-compute[70975]: DEBUG nova.compute.manager [None req-8b90e197-0f0e-4b95-b9d7-b1d316aa1db1 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Start destroying the instance on the hypervisor. {{(pid=70975) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 18 16:16:16 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:16 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:16 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:16 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:16 user nova-compute[70975]: DEBUG nova.compute.manager [req-e3fdc046-5214-455a-bea0-6e7413ee4fae req-e7b06ff2-5316-4fdf-8343-9e9d41b5053d service nova] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Received event network-vif-unplugged-fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:16:16 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-e3fdc046-5214-455a-bea0-6e7413ee4fae req-e7b06ff2-5316-4fdf-8343-9e9d41b5053d service nova] Acquiring lock "993d062c-8462-4534-bcde-9249779d4e90-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:16:16 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-e3fdc046-5214-455a-bea0-6e7413ee4fae req-e7b06ff2-5316-4fdf-8343-9e9d41b5053d service nova] Lock "993d062c-8462-4534-bcde-9249779d4e90-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:16:16 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-e3fdc046-5214-455a-bea0-6e7413ee4fae req-e7b06ff2-5316-4fdf-8343-9e9d41b5053d service nova] Lock "993d062c-8462-4534-bcde-9249779d4e90-events" "released" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:16:16 user nova-compute[70975]: DEBUG nova.compute.manager [req-e3fdc046-5214-455a-bea0-6e7413ee4fae req-e7b06ff2-5316-4fdf-8343-9e9d41b5053d service nova] [instance: 993d062c-8462-4534-bcde-9249779d4e90] No waiting events found dispatching network-vif-unplugged-fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:16:16 user nova-compute[70975]: DEBUG nova.compute.manager [req-e3fdc046-5214-455a-bea0-6e7413ee4fae req-e7b06ff2-5316-4fdf-8343-9e9d41b5053d service nova] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Received event network-vif-unplugged-fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3 for instance with task_state deleting. {{(pid=70975) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 18 16:16:16 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:16 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Instance destroyed successfully. Apr 18 16:16:16 user nova-compute[70975]: DEBUG nova.objects.instance [None req-8b90e197-0f0e-4b95-b9d7-b1d316aa1db1 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lazy-loading 'resources' on Instance uuid 993d062c-8462-4534-bcde-9249779d4e90 {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:16:16 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-8b90e197-0f0e-4b95-b9d7-b1d316aa1db1 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:14:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1278260269',display_name='tempest-AttachVolumeNegativeTest-server-1278260269',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-1278260269',id=5,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDO/CLWqaabu1PPSB6IO5u9ZPRsbyk1aJTiCtDZZM4ehxz6NX8dqpiUe00Z9Nr+BHXqhNNOtIquxOnLmyxJxZVKgMQccZdSmpkhgpRi7hndOMSE64mNrbe1QQ/t5OUkS2w==',key_name='tempest-keypair-87186417',keypairs=,launch_index=0,launched_at=2023-04-18T16:14:40Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='6b4e8d8797be4c0e91b1401538f2eba8',ramdisk_id='',reservation_id='r-5k1ey47l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeNegativeTest-216357456',owner_user_name='tempest-AttachVolumeNegativeTest-216357456-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-18T16:14:41Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='af90e17ec027463fa8793e8539c39e13',uuid=993d062c-8462-4534-bcde-9249779d4e90,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3", "address": "fa:16:3e:d2:bc:43", "network": {"id": "02aca424-2923-404b-9c66-76bec89f82b7", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1255258227-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.61", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6b4e8d8797be4c0e91b1401538f2eba8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe3d4b7c-e1", "ovs_interfaceid": "fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 18 16:16:16 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-8b90e197-0f0e-4b95-b9d7-b1d316aa1db1 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Converting VIF {"id": "fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3", "address": "fa:16:3e:d2:bc:43", "network": {"id": "02aca424-2923-404b-9c66-76bec89f82b7", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1255258227-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.61", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6b4e8d8797be4c0e91b1401538f2eba8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe3d4b7c-e1", "ovs_interfaceid": "fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:16:16 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-8b90e197-0f0e-4b95-b9d7-b1d316aa1db1 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d2:bc:43,bridge_name='br-int',has_traffic_filtering=True,id=fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3,network=Network(02aca424-2923-404b-9c66-76bec89f82b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe3d4b7c-e1') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:16:16 user nova-compute[70975]: DEBUG os_vif [None req-8b90e197-0f0e-4b95-b9d7-b1d316aa1db1 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d2:bc:43,bridge_name='br-int',has_traffic_filtering=True,id=fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3,network=Network(02aca424-2923-404b-9c66-76bec89f82b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe3d4b7c-e1') {{(pid=70975) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 18 16:16:16 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:16 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe3d4b7c-e1, bridge=br-int, if_exists=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:16:16 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:16:16 user nova-compute[70975]: INFO os_vif [None req-8b90e197-0f0e-4b95-b9d7-b1d316aa1db1 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d2:bc:43,bridge_name='br-int',has_traffic_filtering=True,id=fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3,network=Network(02aca424-2923-404b-9c66-76bec89f82b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe3d4b7c-e1') Apr 18 16:16:16 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-8b90e197-0f0e-4b95-b9d7-b1d316aa1db1 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Deleting instance files 
/opt/stack/data/nova/instances/993d062c-8462-4534-bcde-9249779d4e90_del Apr 18 16:16:16 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-8b90e197-0f0e-4b95-b9d7-b1d316aa1db1 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Deletion of /opt/stack/data/nova/instances/993d062c-8462-4534-bcde-9249779d4e90_del complete Apr 18 16:16:16 user nova-compute[70975]: INFO nova.compute.manager [None req-8b90e197-0f0e-4b95-b9d7-b1d316aa1db1 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Took 0.86 seconds to destroy the instance on the hypervisor. Apr 18 16:16:16 user nova-compute[70975]: DEBUG oslo.service.loopingcall [None req-8b90e197-0f0e-4b95-b9d7-b1d316aa1db1 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=70975) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 18 16:16:16 user nova-compute[70975]: DEBUG nova.compute.manager [-] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Deallocating network for instance {{(pid=70975) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 18 16:16:16 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: 993d062c-8462-4534-bcde-9249779d4e90] deallocate_for_instance() {{(pid=70975) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 18 16:16:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:17 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Updating instance_info_cache with network_info: [] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:16:17 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Took 0.97 seconds to deallocate network for instance. 
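The port removal os-vif performed above via ovsdbapp (DelPortCommand on br-int for tapfe3d4b7c-e1, with if_exists=True) has a direct CLI equivalent. A hedged sketch using the standard ovs-vsctl tool from Python:

import subprocess

# Same effect as the logged DelPortCommand transaction: drop the tap port from
# br-int, tolerating the case where it is already gone.
subprocess.run(
    ["ovs-vsctl", "--if-exists", "del-port", "br-int", "tapfe3d4b7c-e1"],
    check=True,
)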
Apr 18 16:16:17 user nova-compute[70975]: DEBUG nova.compute.manager [req-c039f7ec-32cd-4629-839a-8d1707f16da7 req-2856feb1-1bcb-4514-879f-62c0223a205d service nova] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Received event network-vif-deleted-fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:16:17 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-8b90e197-0f0e-4b95-b9d7-b1d316aa1db1 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:16:17 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-8b90e197-0f0e-4b95-b9d7-b1d316aa1db1 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:16:18 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-8b90e197-0f0e-4b95-b9d7-b1d316aa1db1 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:16:18 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-8b90e197-0f0e-4b95-b9d7-b1d316aa1db1 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:16:18 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-8b90e197-0f0e-4b95-b9d7-b1d316aa1db1 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.319s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:16:18 user nova-compute[70975]: INFO nova.scheduler.client.report [None req-8b90e197-0f0e-4b95-b9d7-b1d316aa1db1 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Deleted allocations for instance 993d062c-8462-4534-bcde-9249779d4e90 Apr 18 16:16:18 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-8b90e197-0f0e-4b95-b9d7-b1d316aa1db1 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "993d062c-8462-4534-bcde-9249779d4e90" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.331s {{(pid=70975) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:16:18 user nova-compute[70975]: DEBUG nova.compute.manager [req-43868197-87ae-4a70-a724-1595b3243ca1 req-fcac0d06-ce96-4cb3-ab59-575ff366190b service nova] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Received event network-vif-plugged-fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:16:18 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-43868197-87ae-4a70-a724-1595b3243ca1 req-fcac0d06-ce96-4cb3-ab59-575ff366190b service nova] Acquiring lock "993d062c-8462-4534-bcde-9249779d4e90-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:16:18 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-43868197-87ae-4a70-a724-1595b3243ca1 req-fcac0d06-ce96-4cb3-ab59-575ff366190b service nova] Lock "993d062c-8462-4534-bcde-9249779d4e90-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:16:18 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-43868197-87ae-4a70-a724-1595b3243ca1 req-fcac0d06-ce96-4cb3-ab59-575ff366190b service nova] Lock "993d062c-8462-4534-bcde-9249779d4e90-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:16:18 user nova-compute[70975]: DEBUG nova.compute.manager [req-43868197-87ae-4a70-a724-1595b3243ca1 req-fcac0d06-ce96-4cb3-ab59-575ff366190b service nova] [instance: 993d062c-8462-4534-bcde-9249779d4e90] No waiting events found dispatching network-vif-plugged-fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:16:18 user nova-compute[70975]: WARNING nova.compute.manager [req-43868197-87ae-4a70-a724-1595b3243ca1 req-fcac0d06-ce96-4cb3-ab59-575ff366190b service nova] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Received unexpected event network-vif-plugged-fe3d4b7c-e19e-4931-8ac7-baad36fbf1b3 for instance with vm_state deleted and task_state None. 
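The "Inventory has not changed" payload above is what Placement uses to size this provider. A quick arithmetic sketch of the schedulable capacity it implies, i.e. (total - reserved) * allocation_ratio per resource class, using the logged values:

inventory = {
    "VCPU":      {"total": 12,    "reserved": 0,   "allocation_ratio": 4.0},
    "MEMORY_MB": {"total": 16023, "reserved": 512, "allocation_ratio": 1.0},
    "DISK_GB":   {"total": 40,    "reserved": 0,   "allocation_ratio": 1.0},
}
for rc, inv in inventory.items():
    capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
    print(f"{rc}: {capacity:g}")   # VCPU: 48, MEMORY_MB: 15511, DISK_GB: 40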
Apr 18 16:16:21 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:22 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:26 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:27 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:27 user nova-compute[70975]: DEBUG nova.compute.manager [req-f1c8f9cf-5018-4c99-b64f-8cda2dc39553 req-86f01186-7c3c-4af1-8581-8a876eca63e4 service nova] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Received event network-changed-f2d5008c-284e-45a5-b349-4fe0723e138e {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:16:27 user nova-compute[70975]: DEBUG nova.compute.manager [req-f1c8f9cf-5018-4c99-b64f-8cda2dc39553 req-86f01186-7c3c-4af1-8581-8a876eca63e4 service nova] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Refreshing instance network info cache due to event network-changed-f2d5008c-284e-45a5-b349-4fe0723e138e. {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 18 16:16:27 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-f1c8f9cf-5018-4c99-b64f-8cda2dc39553 req-86f01186-7c3c-4af1-8581-8a876eca63e4 service nova] Acquiring lock "refresh_cache-aaac3797-349f-4695-bea2-8b0c022a66e0" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:16:27 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-f1c8f9cf-5018-4c99-b64f-8cda2dc39553 req-86f01186-7c3c-4af1-8581-8a876eca63e4 service nova] Acquired lock "refresh_cache-aaac3797-349f-4695-bea2-8b0c022a66e0" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:16:27 user nova-compute[70975]: DEBUG nova.network.neutron [req-f1c8f9cf-5018-4c99-b64f-8cda2dc39553 req-86f01186-7c3c-4af1-8581-8a876eca63e4 service nova] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Refreshing network info cache for port f2d5008c-284e-45a5-b349-4fe0723e138e {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 18 16:16:28 user nova-compute[70975]: DEBUG nova.network.neutron [req-f1c8f9cf-5018-4c99-b64f-8cda2dc39553 req-86f01186-7c3c-4af1-8581-8a876eca63e4 service nova] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Updated VIF entry in instance network info cache for port f2d5008c-284e-45a5-b349-4fe0723e138e. 
{{(pid=70975) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 18 16:16:28 user nova-compute[70975]: DEBUG nova.network.neutron [req-f1c8f9cf-5018-4c99-b64f-8cda2dc39553 req-86f01186-7c3c-4af1-8581-8a876eca63e4 service nova] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Updating instance_info_cache with network_info: [{"id": "f2d5008c-284e-45a5-b349-4fe0723e138e", "address": "fa:16:3e:76:88:8e", "network": {"id": "a8157d06-a7f6-4b9c-ae66-cd40da31eb6d", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1365454803-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.31", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8b357cc820a04f3486f98d8e38c1a3d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf2d5008c-28", "ovs_interfaceid": "f2d5008c-284e-45a5-b349-4fe0723e138e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:16:28 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-f1c8f9cf-5018-4c99-b64f-8cda2dc39553 req-86f01186-7c3c-4af1-8581-8a876eca63e4 service nova] Releasing lock "refresh_cache-aaac3797-349f-4695-bea2-8b0c022a66e0" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:16:29 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-a0244e35-2c8c-4fae-a039-7a7095d5cd9e tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Acquiring lock "aaac3797-349f-4695-bea2-8b0c022a66e0" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:16:29 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-a0244e35-2c8c-4fae-a039-7a7095d5cd9e tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Lock "aaac3797-349f-4695-bea2-8b0c022a66e0" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:16:29 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-a0244e35-2c8c-4fae-a039-7a7095d5cd9e tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Acquiring lock "aaac3797-349f-4695-bea2-8b0c022a66e0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:16:29 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-a0244e35-2c8c-4fae-a039-7a7095d5cd9e tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Lock "aaac3797-349f-4695-bea2-8b0c022a66e0-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:16:29 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-a0244e35-2c8c-4fae-a039-7a7095d5cd9e tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Lock "aaac3797-349f-4695-bea2-8b0c022a66e0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:16:29 user nova-compute[70975]: INFO nova.compute.manager [None req-a0244e35-2c8c-4fae-a039-7a7095d5cd9e tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Terminating instance Apr 18 16:16:29 user nova-compute[70975]: DEBUG nova.compute.manager [None req-a0244e35-2c8c-4fae-a039-7a7095d5cd9e tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Start destroying the instance on the hypervisor. {{(pid=70975) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 18 16:16:29 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:29 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:29 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:29 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:29 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:29 user nova-compute[70975]: DEBUG nova.compute.manager [req-c926647e-5217-4b5b-9b17-fa331241780e req-5951b17a-98ed-4a16-a057-da6d05c9e7a8 service nova] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Received event network-vif-unplugged-f2d5008c-284e-45a5-b349-4fe0723e138e {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:16:29 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-c926647e-5217-4b5b-9b17-fa331241780e req-5951b17a-98ed-4a16-a057-da6d05c9e7a8 service nova] Acquiring lock "aaac3797-349f-4695-bea2-8b0c022a66e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:16:29 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-c926647e-5217-4b5b-9b17-fa331241780e req-5951b17a-98ed-4a16-a057-da6d05c9e7a8 service nova] Lock "aaac3797-349f-4695-bea2-8b0c022a66e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:16:29 user nova-compute[70975]: DEBUG 
oslo_concurrency.lockutils [req-c926647e-5217-4b5b-9b17-fa331241780e req-5951b17a-98ed-4a16-a057-da6d05c9e7a8 service nova] Lock "aaac3797-349f-4695-bea2-8b0c022a66e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:16:29 user nova-compute[70975]: DEBUG nova.compute.manager [req-c926647e-5217-4b5b-9b17-fa331241780e req-5951b17a-98ed-4a16-a057-da6d05c9e7a8 service nova] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] No waiting events found dispatching network-vif-unplugged-f2d5008c-284e-45a5-b349-4fe0723e138e {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:16:29 user nova-compute[70975]: DEBUG nova.compute.manager [req-c926647e-5217-4b5b-9b17-fa331241780e req-5951b17a-98ed-4a16-a057-da6d05c9e7a8 service nova] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Received event network-vif-unplugged-f2d5008c-284e-45a5-b349-4fe0723e138e for instance with task_state deleting. {{(pid=70975) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 18 16:16:29 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Instance destroyed successfully. Apr 18 16:16:29 user nova-compute[70975]: DEBUG nova.objects.instance [None req-a0244e35-2c8c-4fae-a039-7a7095d5cd9e tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Lazy-loading 'resources' on Instance uuid aaac3797-349f-4695-bea2-8b0c022a66e0 {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:16:29 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-a0244e35-2c8c-4fae-a039-7a7095d5cd9e tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2023-04-18T16:14:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-182924015',display_name='tempest-AttachSCSIVolumeTestJSON-server-182924015',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachscsivolumetestjson-server-182924015',id=6,image_ref='2824808e-fd92-429e-ad00-18522a9ee7be',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBONN6MRKRvMlP+yhMvoK61g7j3Fx5jQQ3LJcp2/nEL6Bw4QbghpzmZf4ISq5JqxxU2idrMJX8n+LTtOiGMui6w8KD50cE/dbEKkPZMqCi7adfHQOEiIsLKutmIbwwhmkFw==',key_name='tempest-keypair-566143322',keypairs=,launch_index=0,launched_at=2023-04-18T16:14:47Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='8b357cc820a04f3486f98d8e38c1a3d6',ramdisk_id='',reservation_id='r-mxkh2z6f',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2824808e-fd92-429e-ad00-18522a9ee7be',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_scsi_model='virtio-scsi',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachSCSIVolumeTestJSON-344223138',owner_user_name='tempest-AttachSCSIVolumeTestJSON-344223138-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-18T16:14:48Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ffaa9df682cb40739d1d754000e04743',uuid=aaac3797-349f-4695-bea2-8b0c022a66e0,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f2d5008c-284e-45a5-b349-4fe0723e138e", "address": "fa:16:3e:76:88:8e", "network": {"id": "a8157d06-a7f6-4b9c-ae66-cd40da31eb6d", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1365454803-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.31", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8b357cc820a04f3486f98d8e38c1a3d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf2d5008c-28", "ovs_interfaceid": "f2d5008c-284e-45a5-b349-4fe0723e138e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 18 16:16:29 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-a0244e35-2c8c-4fae-a039-7a7095d5cd9e tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Converting VIF {"id": "f2d5008c-284e-45a5-b349-4fe0723e138e", "address": "fa:16:3e:76:88:8e", "network": {"id": "a8157d06-a7f6-4b9c-ae66-cd40da31eb6d", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1365454803-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.31", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], 
"version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8b357cc820a04f3486f98d8e38c1a3d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf2d5008c-28", "ovs_interfaceid": "f2d5008c-284e-45a5-b349-4fe0723e138e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:16:29 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-a0244e35-2c8c-4fae-a039-7a7095d5cd9e tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:76:88:8e,bridge_name='br-int',has_traffic_filtering=True,id=f2d5008c-284e-45a5-b349-4fe0723e138e,network=Network(a8157d06-a7f6-4b9c-ae66-cd40da31eb6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf2d5008c-28') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:16:29 user nova-compute[70975]: DEBUG os_vif [None req-a0244e35-2c8c-4fae-a039-7a7095d5cd9e tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:76:88:8e,bridge_name='br-int',has_traffic_filtering=True,id=f2d5008c-284e-45a5-b349-4fe0723e138e,network=Network(a8157d06-a7f6-4b9c-ae66-cd40da31eb6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf2d5008c-28') {{(pid=70975) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 18 16:16:29 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:29 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf2d5008c-28, bridge=br-int, if_exists=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:16:29 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:29 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:16:29 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:29 user nova-compute[70975]: INFO os_vif [None req-a0244e35-2c8c-4fae-a039-7a7095d5cd9e tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:76:88:8e,bridge_name='br-int',has_traffic_filtering=True,id=f2d5008c-284e-45a5-b349-4fe0723e138e,network=Network(a8157d06-a7f6-4b9c-ae66-cd40da31eb6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf2d5008c-28') Apr 18 16:16:29 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-a0244e35-2c8c-4fae-a039-7a7095d5cd9e 
tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Deleting instance files /opt/stack/data/nova/instances/aaac3797-349f-4695-bea2-8b0c022a66e0_del Apr 18 16:16:29 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-a0244e35-2c8c-4fae-a039-7a7095d5cd9e tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Deletion of /opt/stack/data/nova/instances/aaac3797-349f-4695-bea2-8b0c022a66e0_del complete Apr 18 16:16:29 user nova-compute[70975]: INFO nova.compute.manager [None req-a0244e35-2c8c-4fae-a039-7a7095d5cd9e tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Took 0.65 seconds to destroy the instance on the hypervisor. Apr 18 16:16:29 user nova-compute[70975]: DEBUG oslo.service.loopingcall [None req-a0244e35-2c8c-4fae-a039-7a7095d5cd9e tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=70975) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 18 16:16:29 user nova-compute[70975]: DEBUG nova.compute.manager [-] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Deallocating network for instance {{(pid=70975) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 18 16:16:29 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] deallocate_for_instance() {{(pid=70975) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 18 16:16:31 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Updating instance_info_cache with network_info: [] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:16:31 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Took 1.10 seconds to deallocate network for instance. 
Apr 18 16:16:31 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-a0244e35-2c8c-4fae-a039-7a7095d5cd9e tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:16:31 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-a0244e35-2c8c-4fae-a039-7a7095d5cd9e tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:16:31 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-a0244e35-2c8c-4fae-a039-7a7095d5cd9e tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:16:31 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-a0244e35-2c8c-4fae-a039-7a7095d5cd9e tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:16:31 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-a0244e35-2c8c-4fae-a039-7a7095d5cd9e tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.255s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:16:31 user nova-compute[70975]: INFO nova.scheduler.client.report [None req-a0244e35-2c8c-4fae-a039-7a7095d5cd9e tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Deleted allocations for instance aaac3797-349f-4695-bea2-8b0c022a66e0 Apr 18 16:16:31 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-a0244e35-2c8c-4fae-a039-7a7095d5cd9e tempest-AttachSCSIVolumeTestJSON-344223138 tempest-AttachSCSIVolumeTestJSON-344223138-project-member] Lock "aaac3797-349f-4695-bea2-8b0c022a66e0" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.220s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:16:31 user nova-compute[70975]: DEBUG nova.compute.manager [req-1c14db23-d5d1-45ef-8bf5-3111f0e3eeb9 req-3b2866d4-258a-49fc-bc53-da9c17345050 service nova] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Received event network-vif-plugged-f2d5008c-284e-45a5-b349-4fe0723e138e {{(pid=70975) external_instance_event 
/opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:16:31 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-1c14db23-d5d1-45ef-8bf5-3111f0e3eeb9 req-3b2866d4-258a-49fc-bc53-da9c17345050 service nova] Acquiring lock "aaac3797-349f-4695-bea2-8b0c022a66e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:16:31 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-1c14db23-d5d1-45ef-8bf5-3111f0e3eeb9 req-3b2866d4-258a-49fc-bc53-da9c17345050 service nova] Lock "aaac3797-349f-4695-bea2-8b0c022a66e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:16:31 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-1c14db23-d5d1-45ef-8bf5-3111f0e3eeb9 req-3b2866d4-258a-49fc-bc53-da9c17345050 service nova] Lock "aaac3797-349f-4695-bea2-8b0c022a66e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:16:31 user nova-compute[70975]: DEBUG nova.compute.manager [req-1c14db23-d5d1-45ef-8bf5-3111f0e3eeb9 req-3b2866d4-258a-49fc-bc53-da9c17345050 service nova] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] No waiting events found dispatching network-vif-plugged-f2d5008c-284e-45a5-b349-4fe0723e138e {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:16:31 user nova-compute[70975]: WARNING nova.compute.manager [req-1c14db23-d5d1-45ef-8bf5-3111f0e3eeb9 req-3b2866d4-258a-49fc-bc53-da9c17345050 service nova] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Received unexpected event network-vif-plugged-f2d5008c-284e-45a5-b349-4fe0723e138e for instance with vm_state deleted and task_state None. 
Apr 18 16:16:31 user nova-compute[70975]: DEBUG nova.compute.manager [req-1c14db23-d5d1-45ef-8bf5-3111f0e3eeb9 req-3b2866d4-258a-49fc-bc53-da9c17345050 service nova] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Received event network-vif-deleted-f2d5008c-284e-45a5-b349-4fe0723e138e {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:16:31 user nova-compute[70975]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:16:31 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: 993d062c-8462-4534-bcde-9249779d4e90] VM Stopped (Lifecycle Event) Apr 18 16:16:31 user nova-compute[70975]: DEBUG nova.compute.manager [None req-306c3cb0-e218-4876-960f-3a9312176c2f None None] [instance: 993d062c-8462-4534-bcde-9249779d4e90] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:16:32 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:34 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:35 user nova-compute[70975]: DEBUG nova.compute.manager [req-9bd79f78-5aa9-4f24-9ce3-778c3db452c3 req-76c3b475-8a9a-4551-ae93-7a118d6547e8 service nova] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Received event network-changed-395afd81-e898-47ee-a928-eaab584d5b4e {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:16:35 user nova-compute[70975]: DEBUG nova.compute.manager [req-9bd79f78-5aa9-4f24-9ce3-778c3db452c3 req-76c3b475-8a9a-4551-ae93-7a118d6547e8 service nova] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Refreshing instance network info cache due to event network-changed-395afd81-e898-47ee-a928-eaab584d5b4e. {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 18 16:16:35 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-9bd79f78-5aa9-4f24-9ce3-778c3db452c3 req-76c3b475-8a9a-4551-ae93-7a118d6547e8 service nova] Acquiring lock "refresh_cache-6c592508-0444-4b42-a0b5-e3d8bd97f5ba" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:16:35 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-9bd79f78-5aa9-4f24-9ce3-778c3db452c3 req-76c3b475-8a9a-4551-ae93-7a118d6547e8 service nova] Acquired lock "refresh_cache-6c592508-0444-4b42-a0b5-e3d8bd97f5ba" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:16:35 user nova-compute[70975]: DEBUG nova.network.neutron [req-9bd79f78-5aa9-4f24-9ce3-778c3db452c3 req-76c3b475-8a9a-4551-ae93-7a118d6547e8 service nova] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Refreshing network info cache for port 395afd81-e898-47ee-a928-eaab584d5b4e {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 18 16:16:35 user nova-compute[70975]: DEBUG nova.network.neutron [req-9bd79f78-5aa9-4f24-9ce3-778c3db452c3 req-76c3b475-8a9a-4551-ae93-7a118d6547e8 service nova] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Updated VIF entry in instance network info cache for port 395afd81-e898-47ee-a928-eaab584d5b4e. 
{{(pid=70975) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 18 16:16:35 user nova-compute[70975]: DEBUG nova.network.neutron [req-9bd79f78-5aa9-4f24-9ce3-778c3db452c3 req-76c3b475-8a9a-4551-ae93-7a118d6547e8 service nova] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Updating instance_info_cache with network_info: [{"id": "395afd81-e898-47ee-a928-eaab584d5b4e", "address": "fa:16:3e:fa:1c:ad", "network": {"id": "0fafa4bf-5d8d-43f9-8cc3-ae6e7c58812c", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-891115046-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.120", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8edf93a24e754e1ea58c0a7fd4f553dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap395afd81-e8", "ovs_interfaceid": "395afd81-e898-47ee-a928-eaab584d5b4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:16:35 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-9bd79f78-5aa9-4f24-9ce3-778c3db452c3 req-76c3b475-8a9a-4551-ae93-7a118d6547e8 service nova] Releasing lock "refresh_cache-6c592508-0444-4b42-a0b5-e3d8bd97f5ba" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:16:37 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:37 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Acquiring lock "0ad9c135-f279-4bd8-982d-65b45242adcf" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:16:37 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Lock "0ad9c135-f279-4bd8-982d-65b45242adcf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:16:37 user nova-compute[70975]: DEBUG nova.compute.manager [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Starting instance... 
{{(pid=70975) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 18 16:16:37 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:16:37 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:16:37 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=70975) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 18 16:16:37 user nova-compute[70975]: INFO nova.compute.claims [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Claim successful on node user Apr 18 16:16:38 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:16:38 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:16:38 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.336s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:16:38 user nova-compute[70975]: DEBUG nova.compute.manager [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Start building networks asynchronously for 
instance. {{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 18 16:16:38 user nova-compute[70975]: DEBUG nova.compute.manager [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Allocating IP information in the background. {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 18 16:16:38 user nova-compute[70975]: DEBUG nova.network.neutron [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] allocate_for_instance() {{(pid=70975) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 18 16:16:38 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 18 16:16:38 user nova-compute[70975]: DEBUG nova.compute.manager [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Start building block device mappings for instance. {{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 18 16:16:38 user nova-compute[70975]: DEBUG nova.policy [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '299ba2e202244f59a09e22df9ea8efe7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8edf93a24e754e1ea58c0a7fd4f553dc', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70975) authorize /opt/stack/nova/nova/policy.py:203}} Apr 18 16:16:38 user nova-compute[70975]: DEBUG nova.compute.manager [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Start spawning the instance on the hypervisor. 
{{(pid=70975) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 18 16:16:38 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Creating instance directory {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 18 16:16:38 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Creating image(s) Apr 18 16:16:38 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Acquiring lock "/opt/stack/data/nova/instances/0ad9c135-f279-4bd8-982d-65b45242adcf/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:16:38 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Lock "/opt/stack/data/nova/instances/0ad9c135-f279-4bd8-982d-65b45242adcf/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:16:38 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Lock "/opt/stack/data/nova/instances/0ad9c135-f279-4bd8-982d-65b45242adcf/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:16:38 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:16:38 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.133s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:16:38 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None 
req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Acquiring lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:16:38 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:16:38 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:16:38 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.129s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:16:38 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/0ad9c135-f279-4bd8-982d-65b45242adcf/disk 1073741824 {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:16:38 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/0ad9c135-f279-4bd8-982d-65b45242adcf/disk 1073741824" returned: 0 in 0.046s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:16:38 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 
0.182s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:16:38 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:16:38 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.157s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:16:38 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Checking if we can resize image /opt/stack/data/nova/instances/0ad9c135-f279-4bd8-982d-65b45242adcf/disk. size=1073741824 {{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 18 16:16:38 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0ad9c135-f279-4bd8-982d-65b45242adcf/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:16:38 user nova-compute[70975]: DEBUG nova.network.neutron [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Successfully created port: 21586886-79a5-4cab-bcfe-b52b65fbf177 {{(pid=70975) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 18 16:16:38 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0ad9c135-f279-4bd8-982d-65b45242adcf/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:16:38 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Cannot resize image /opt/stack/data/nova/instances/0ad9c135-f279-4bd8-982d-65b45242adcf/disk to a smaller size. 
{{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 18 16:16:38 user nova-compute[70975]: DEBUG nova.objects.instance [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Lazy-loading 'migration_context' on Instance uuid 0ad9c135-f279-4bd8-982d-65b45242adcf {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:16:38 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Created local disks {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 18 16:16:38 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Ensure instance console log exists: /opt/stack/data/nova/instances/0ad9c135-f279-4bd8-982d-65b45242adcf/console.log {{(pid=70975) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 18 16:16:38 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:16:38 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:16:38 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:16:39 user nova-compute[70975]: DEBUG nova.network.neutron [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Successfully updated port: 21586886-79a5-4cab-bcfe-b52b65fbf177 {{(pid=70975) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 18 16:16:39 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Acquiring lock "refresh_cache-0ad9c135-f279-4bd8-982d-65b45242adcf" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:16:39 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 
tempest-VolumesAdminNegativeTest-2015888259-project-member] Acquired lock "refresh_cache-0ad9c135-f279-4bd8-982d-65b45242adcf" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:16:39 user nova-compute[70975]: DEBUG nova.network.neutron [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Building network info cache for instance {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 18 16:16:39 user nova-compute[70975]: DEBUG nova.compute.manager [req-54dbddf9-bd9b-4042-9a26-f99abae0bdfb req-e2d6156a-2256-45a1-8092-44f06fb049a3 service nova] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Received event network-changed-21586886-79a5-4cab-bcfe-b52b65fbf177 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:16:39 user nova-compute[70975]: DEBUG nova.compute.manager [req-54dbddf9-bd9b-4042-9a26-f99abae0bdfb req-e2d6156a-2256-45a1-8092-44f06fb049a3 service nova] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Refreshing instance network info cache due to event network-changed-21586886-79a5-4cab-bcfe-b52b65fbf177. {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 18 16:16:39 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-54dbddf9-bd9b-4042-9a26-f99abae0bdfb req-e2d6156a-2256-45a1-8092-44f06fb049a3 service nova] Acquiring lock "refresh_cache-0ad9c135-f279-4bd8-982d-65b45242adcf" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:16:39 user nova-compute[70975]: DEBUG nova.network.neutron [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Instance cache missing network info. 
{{(pid=70975) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 18 16:16:39 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:39 user nova-compute[70975]: DEBUG nova.network.neutron [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Updating instance_info_cache with network_info: [{"id": "21586886-79a5-4cab-bcfe-b52b65fbf177", "address": "fa:16:3e:b1:99:de", "network": {"id": "0fafa4bf-5d8d-43f9-8cc3-ae6e7c58812c", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-891115046-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8edf93a24e754e1ea58c0a7fd4f553dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap21586886-79", "ovs_interfaceid": "21586886-79a5-4cab-bcfe-b52b65fbf177", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:16:39 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Releasing lock "refresh_cache-0ad9c135-f279-4bd8-982d-65b45242adcf" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:16:39 user nova-compute[70975]: DEBUG nova.compute.manager [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Instance network_info: |[{"id": "21586886-79a5-4cab-bcfe-b52b65fbf177", "address": "fa:16:3e:b1:99:de", "network": {"id": "0fafa4bf-5d8d-43f9-8cc3-ae6e7c58812c", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-891115046-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8edf93a24e754e1ea58c0a7fd4f553dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap21586886-79", "ovs_interfaceid": "21586886-79a5-4cab-bcfe-b52b65fbf177", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 18 16:16:39 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils 
[req-54dbddf9-bd9b-4042-9a26-f99abae0bdfb req-e2d6156a-2256-45a1-8092-44f06fb049a3 service nova] Acquired lock "refresh_cache-0ad9c135-f279-4bd8-982d-65b45242adcf" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:16:39 user nova-compute[70975]: DEBUG nova.network.neutron [req-54dbddf9-bd9b-4042-9a26-f99abae0bdfb req-e2d6156a-2256-45a1-8092-44f06fb049a3 service nova] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Refreshing network info cache for port 21586886-79a5-4cab-bcfe-b52b65fbf177 {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 18 16:16:39 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Start _get_guest_xml network_info=[{"id": "21586886-79a5-4cab-bcfe-b52b65fbf177", "address": "fa:16:3e:b1:99:de", "network": {"id": "0fafa4bf-5d8d-43f9-8cc3-ae6e7c58812c", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-891115046-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8edf93a24e754e1ea58c0a7fd4f553dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap21586886-79", "ovs_interfaceid": "21586886-79a5-4cab-bcfe-b52b65fbf177", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encrypted': False, 'device_type': 'disk', 'encryption_secret_uuid': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'b11a20de-f82a-4158-b53e-0a0c7a1552cb'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 18 16:16:39 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 18 16:16:39 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:16:39 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70975) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 18 16:16:39 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-18T16:11:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=), allow threads: True {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 18 16:16:39 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Flavor limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 18 16:16:39 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Image limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 18 16:16:39 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Flavor pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 18 16:16:39 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Image pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 18 16:16:39 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 18 16:16:39 user nova-compute[70975]: DEBUG 
nova.virt.hardware [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 18 16:16:39 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 18 16:16:39 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Got 1 possible topologies {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 18 16:16:39 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 18 16:16:39 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 18 16:16:39 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:16:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-995654785',display_name='tempest-VolumesAdminNegativeTest-server-995654785',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-995654785',id=10,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8edf93a24e754e1ea58c0a7fd4f553dc',ramdisk_id='',reservation_id='r-3vh920j1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-2015888259',owner_user_name='tempest-VolumesAdminNegativeTest-2015888259-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:16:38Z,user_data=None,user_id='299ba2e202244f59a09e22df9ea8efe7',uuid=0ad9c135-f279-4bd8-982d-65b45242adcf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "21586886-79a5-4cab-bcfe-b52b65fbf177", "address": "fa:16:3e:b1:99:de", "network": {"id": "0fafa4bf-5d8d-43f9-8cc3-ae6e7c58812c", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-891115046-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8edf93a24e754e1ea58c0a7fd4f553dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap21586886-79", "ovs_interfaceid": "21586886-79a5-4cab-bcfe-b52b65fbf177", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70975) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 18 16:16:39 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Converting VIF {"id": "21586886-79a5-4cab-bcfe-b52b65fbf177", "address": "fa:16:3e:b1:99:de", "network": {"id": 
"0fafa4bf-5d8d-43f9-8cc3-ae6e7c58812c", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-891115046-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8edf93a24e754e1ea58c0a7fd4f553dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap21586886-79", "ovs_interfaceid": "21586886-79a5-4cab-bcfe-b52b65fbf177", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:16:39 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:99:de,bridge_name='br-int',has_traffic_filtering=True,id=21586886-79a5-4cab-bcfe-b52b65fbf177,network=Network(0fafa4bf-5d8d-43f9-8cc3-ae6e7c58812c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21586886-79') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:16:39 user nova-compute[70975]: DEBUG nova.objects.instance [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Lazy-loading 'pci_devices' on Instance uuid 0ad9c135-f279-4bd8-982d-65b45242adcf {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:16:39 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] End _get_guest_xml xml= Apr 18 16:16:39 user nova-compute[70975]: 0ad9c135-f279-4bd8-982d-65b45242adcf Apr 18 16:16:39 user nova-compute[70975]: instance-0000000a Apr 18 16:16:39 user nova-compute[70975]: 131072 Apr 18 16:16:39 user nova-compute[70975]: 1 Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: tempest-VolumesAdminNegativeTest-server-995654785 Apr 18 16:16:39 user nova-compute[70975]: 2023-04-18 16:16:39 Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: 128 Apr 18 16:16:39 user nova-compute[70975]: 1 Apr 18 16:16:39 user nova-compute[70975]: 0 Apr 18 16:16:39 user nova-compute[70975]: 0 Apr 18 16:16:39 user nova-compute[70975]: 1 Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: tempest-VolumesAdminNegativeTest-2015888259-project-member Apr 18 16:16:39 user nova-compute[70975]: tempest-VolumesAdminNegativeTest-2015888259 Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 
user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: OpenStack Foundation Apr 18 16:16:39 user nova-compute[70975]: OpenStack Nova Apr 18 16:16:39 user nova-compute[70975]: 0.0.0 Apr 18 16:16:39 user nova-compute[70975]: 0ad9c135-f279-4bd8-982d-65b45242adcf Apr 18 16:16:39 user nova-compute[70975]: 0ad9c135-f279-4bd8-982d-65b45242adcf Apr 18 16:16:39 user nova-compute[70975]: Virtual Machine Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: hvm Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Nehalem Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: /dev/urandom Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: Apr 18 16:16:39 user nova-compute[70975]: {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 18 16:16:39 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:16:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-995654785',display_name='tempest-VolumesAdminNegativeTest-server-995654785',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-995654785',id=10,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8edf93a24e754e1ea58c0a7fd4f553dc',ramdisk_id='',reservation_id='r-3vh920j1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-2015888259',owner_user_name='tempest-VolumesAdminNegativeTest-2015888259-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:16:38Z,user_data=None,user_id='299ba2e202244f59a09e22df9ea8efe7',uuid=0ad9c135-f279-4bd8-982d-65b45242adcf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "21586886-79a5-4cab-bcfe-b52b65fbf177", "address": "fa:16:3e:b1:99:de", "network": {"id": "0fafa4bf-5d8d-43f9-8cc3-ae6e7c58812c", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-891115046-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8edf93a24e754e1ea58c0a7fd4f553dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap21586886-79", "ovs_interfaceid": "21586886-79a5-4cab-bcfe-b52b65fbf177", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 18 16:16:39 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Converting VIF {"id": "21586886-79a5-4cab-bcfe-b52b65fbf177", "address": "fa:16:3e:b1:99:de", "network": {"id": 
"0fafa4bf-5d8d-43f9-8cc3-ae6e7c58812c", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-891115046-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8edf93a24e754e1ea58c0a7fd4f553dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap21586886-79", "ovs_interfaceid": "21586886-79a5-4cab-bcfe-b52b65fbf177", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:16:39 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:99:de,bridge_name='br-int',has_traffic_filtering=True,id=21586886-79a5-4cab-bcfe-b52b65fbf177,network=Network(0fafa4bf-5d8d-43f9-8cc3-ae6e7c58812c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21586886-79') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:16:39 user nova-compute[70975]: DEBUG os_vif [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:99:de,bridge_name='br-int',has_traffic_filtering=True,id=21586886-79a5-4cab-bcfe-b52b65fbf177,network=Network(0fafa4bf-5d8d-43f9-8cc3-ae6e7c58812c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21586886-79') {{(pid=70975) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 18 16:16:39 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:39 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:16:39 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 18 16:16:40 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:40 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap21586886-79, may_exist=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:16:40 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): 
DbSetCommand(_result=None, table=Interface, record=tap21586886-79, col_values=(('external_ids', {'iface-id': '21586886-79a5-4cab-bcfe-b52b65fbf177', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b1:99:de', 'vm-uuid': '0ad9c135-f279-4bd8-982d-65b45242adcf'}),)) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:16:40 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:40 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:16:40 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:40 user nova-compute[70975]: INFO os_vif [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:99:de,bridge_name='br-int',has_traffic_filtering=True,id=21586886-79a5-4cab-bcfe-b52b65fbf177,network=Network(0fafa4bf-5d8d-43f9-8cc3-ae6e7c58812c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21586886-79') Apr 18 16:16:40 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] No BDM found with device name vda, not building metadata. {{(pid=70975) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 18 16:16:40 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] No VIF found with MAC fa:16:3e:b1:99:de, not building metadata {{(pid=70975) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 18 16:16:40 user nova-compute[70975]: DEBUG nova.network.neutron [req-54dbddf9-bd9b-4042-9a26-f99abae0bdfb req-e2d6156a-2256-45a1-8092-44f06fb049a3 service nova] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Updated VIF entry in instance network info cache for port 21586886-79a5-4cab-bcfe-b52b65fbf177. 
{{(pid=70975) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 18 16:16:40 user nova-compute[70975]: DEBUG nova.network.neutron [req-54dbddf9-bd9b-4042-9a26-f99abae0bdfb req-e2d6156a-2256-45a1-8092-44f06fb049a3 service nova] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Updating instance_info_cache with network_info: [{"id": "21586886-79a5-4cab-bcfe-b52b65fbf177", "address": "fa:16:3e:b1:99:de", "network": {"id": "0fafa4bf-5d8d-43f9-8cc3-ae6e7c58812c", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-891115046-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8edf93a24e754e1ea58c0a7fd4f553dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap21586886-79", "ovs_interfaceid": "21586886-79a5-4cab-bcfe-b52b65fbf177", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:16:40 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-54dbddf9-bd9b-4042-9a26-f99abae0bdfb req-e2d6156a-2256-45a1-8092-44f06fb049a3 service nova] Releasing lock "refresh_cache-0ad9c135-f279-4bd8-982d-65b45242adcf" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:16:40 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:41 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:41 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:41 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:41 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:41 user nova-compute[70975]: DEBUG nova.compute.manager [req-cf77cf3b-b342-4718-b7b1-bee9bf69ba00 req-d37bbf4b-6a63-442c-963a-43f575d3d7c2 service nova] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Received event network-vif-plugged-21586886-79a5-4cab-bcfe-b52b65fbf177 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:16:41 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-cf77cf3b-b342-4718-b7b1-bee9bf69ba00 req-d37bbf4b-6a63-442c-963a-43f575d3d7c2 service nova] Acquiring lock "0ad9c135-f279-4bd8-982d-65b45242adcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:16:41 user nova-compute[70975]: DEBUG 
oslo_concurrency.lockutils [req-cf77cf3b-b342-4718-b7b1-bee9bf69ba00 req-d37bbf4b-6a63-442c-963a-43f575d3d7c2 service nova] Lock "0ad9c135-f279-4bd8-982d-65b45242adcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:16:41 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-cf77cf3b-b342-4718-b7b1-bee9bf69ba00 req-d37bbf4b-6a63-442c-963a-43f575d3d7c2 service nova] Lock "0ad9c135-f279-4bd8-982d-65b45242adcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:16:41 user nova-compute[70975]: DEBUG nova.compute.manager [req-cf77cf3b-b342-4718-b7b1-bee9bf69ba00 req-d37bbf4b-6a63-442c-963a-43f575d3d7c2 service nova] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] No waiting events found dispatching network-vif-plugged-21586886-79a5-4cab-bcfe-b52b65fbf177 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:16:41 user nova-compute[70975]: WARNING nova.compute.manager [req-cf77cf3b-b342-4718-b7b1-bee9bf69ba00 req-d37bbf4b-6a63-442c-963a-43f575d3d7c2 service nova] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Received unexpected event network-vif-plugged-21586886-79a5-4cab-bcfe-b52b65fbf177 for instance with vm_state building and task_state spawning. Apr 18 16:16:42 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:43 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Resumed> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:16:43 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] VM Resumed (Lifecycle Event) Apr 18 16:16:43 user nova-compute[70975]: DEBUG nova.compute.manager [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Instance event wait completed in 0 seconds for {{(pid=70975) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 18 16:16:43 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Guest created on hypervisor {{(pid=70975) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 18 16:16:43 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Instance spawned successfully. 
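[annotation] The ovsdbapp transaction a few records above (AddBridgeCommand, AddPortCommand and the DbSetCommand on the Interface row) corresponds roughly to the ovs-vsctl invocations below, which can be useful when reproducing the plug by hand. This is only a debugging aid under that assumption: os-vif applies the change through ovsdbapp's transaction API rather than shelling out, and the port name and external_ids values are simply copied from the records above.

    import subprocess

    BRIDGE = "br-int"
    PORT = "tap21586886-79"
    EXTERNAL_IDS = {
        "iface-id": "21586886-79a5-4cab-bcfe-b52b65fbf177",
        "iface-status": "active",
        "attached-mac": "fa:16:3e:b1:99:de",
        "vm-uuid": "0ad9c135-f279-4bd8-982d-65b45242adcf",
    }

    def plug_port_by_hand():
        # AddBridgeCommand(name=br-int, may_exist=True, datapath_type=system)
        subprocess.run(["ovs-vsctl", "--may-exist", "add-br", BRIDGE,
                        "--", "set", "Bridge", BRIDGE, "datapath_type=system"],
                       check=True)
        # AddPortCommand(bridge=br-int, port=tap..., may_exist=True) plus the
        # DbSetCommand that stamps external_ids onto the Interface row.
        set_ids = ['external_ids:%s="%s"' % kv for kv in EXTERNAL_IDS.items()]
        subprocess.run(["ovs-vsctl", "--may-exist", "add-port", BRIDGE, PORT,
                        "--", "set", "Interface", PORT] + set_ids,
                       check=True)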
Apr 18 16:16:43 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 18 16:16:43 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:16:43 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:16:43 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Found default for hw_cdrom_bus of ide {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:16:43 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Found default for hw_disk_bus of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:16:43 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Found default for hw_input_bus of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:16:43 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Found default for hw_pointer_model of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:16:43 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Found default for hw_video_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:16:43 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 
0ad9c135-f279-4bd8-982d-65b45242adcf] Found default for hw_vif_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:16:43 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] During sync_power_state the instance has a pending task (spawning). Skip. Apr 18 16:16:43 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Started> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:16:43 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] VM Started (Lifecycle Event) Apr 18 16:16:43 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:16:43 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:16:43 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] During sync_power_state the instance has a pending task (spawning). Skip. Apr 18 16:16:43 user nova-compute[70975]: INFO nova.compute.manager [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Took 5.24 seconds to spawn the instance on the hypervisor. 
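[annotation] The "Synchronizing instance power state" records above compare the numeric power state stored in the database with the one reported by the hypervisor (DB power_state: 0 versus VM power_state: 1). The values follow nova.compute.power_state; a small lookup table makes those pairs readable (a convenience sketch, not Nova code):

    # Numeric power states as they appear in the records above
    # (mirrors nova.compute.power_state).
    POWER_STATE = {
        0: "NOSTATE",
        1: "RUNNING",
        3: "PAUSED",
        4: "SHUTDOWN",
        6: "CRASHED",
        7: "SUSPENDED",
    }

    def describe_sync(db_state, vm_state):
        return "DB says %s, hypervisor says %s" % (
            POWER_STATE.get(db_state, db_state),
            POWER_STATE.get(vm_state, vm_state))

    # The "Resumed"/"Started" lifecycle events above: DB 0, hypervisor 1.
    print(describe_sync(0, 1))   # DB says NOSTATE, hypervisor says RUNNING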
Apr 18 16:16:43 user nova-compute[70975]: DEBUG nova.compute.manager [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:16:43 user nova-compute[70975]: DEBUG nova.compute.manager [req-886a7820-35c0-47c5-9804-9997ce894c26 req-907df151-834e-440d-827a-6372f338232a service nova] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Received event network-vif-plugged-21586886-79a5-4cab-bcfe-b52b65fbf177 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:16:43 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-886a7820-35c0-47c5-9804-9997ce894c26 req-907df151-834e-440d-827a-6372f338232a service nova] Acquiring lock "0ad9c135-f279-4bd8-982d-65b45242adcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:16:43 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-886a7820-35c0-47c5-9804-9997ce894c26 req-907df151-834e-440d-827a-6372f338232a service nova] Lock "0ad9c135-f279-4bd8-982d-65b45242adcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:16:43 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-886a7820-35c0-47c5-9804-9997ce894c26 req-907df151-834e-440d-827a-6372f338232a service nova] Lock "0ad9c135-f279-4bd8-982d-65b45242adcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:16:43 user nova-compute[70975]: DEBUG nova.compute.manager [req-886a7820-35c0-47c5-9804-9997ce894c26 req-907df151-834e-440d-827a-6372f338232a service nova] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] No waiting events found dispatching network-vif-plugged-21586886-79a5-4cab-bcfe-b52b65fbf177 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:16:43 user nova-compute[70975]: WARNING nova.compute.manager [req-886a7820-35c0-47c5-9804-9997ce894c26 req-907df151-834e-440d-827a-6372f338232a service nova] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Received unexpected event network-vif-plugged-21586886-79a5-4cab-bcfe-b52b65fbf177 for instance with vm_state building and task_state spawning. Apr 18 16:16:43 user nova-compute[70975]: INFO nova.compute.manager [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Took 5.86 seconds to build instance. 
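[annotation] The "No waiting events found" / "Received unexpected event" pairs above are benign here: Neutron sent network-vif-plugged for the port, but nothing on the compute side was still registered as waiting for it, so the notification is dropped with a warning. A greatly simplified model of that bookkeeping, with illustrative names only (Nova's InstanceEvents machinery is more involved):

    import threading
    from collections import defaultdict

    class InstanceEventsSketch:
        """Toy per-instance event matching: waiters register an event name,
        incoming notifications either wake a waiter or are dropped."""

        def __init__(self):
            self._lock = threading.Lock()
            self._waiters = defaultdict(dict)   # instance uuid -> {event name: Event}

        def prepare(self, instance_uuid, name):
            waiter = threading.Event()
            with self._lock:
                self._waiters[instance_uuid][name] = waiter
            return waiter

        def dispatch(self, instance_uuid, name):
            with self._lock:
                waiter = self._waiters[instance_uuid].pop(name, None)
            if waiter is None:
                # Analogous to "No waiting events found dispatching
                # network-vif-plugged-..." in the records above.
                return False
            waiter.set()
            return True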
Apr 18 16:16:43 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c6686c47-1616-4faf-b902-65a26c4ec2ed tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Lock "0ad9c135-f279-4bd8-982d-65b45242adcf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 5.951s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:16:44 user nova-compute[70975]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:16:44 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] VM Stopped (Lifecycle Event) Apr 18 16:16:44 user nova-compute[70975]: DEBUG nova.compute.manager [None req-dfbbaad9-81c8-4c74-8a15-166a410bcfe8 None None] [instance: aaac3797-349f-4695-bea2-8b0c022a66e0] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:16:45 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:46 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:47 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:49 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:50 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:50 user nova-compute[70975]: DEBUG nova.compute.manager [req-7d6e8e18-2cc3-4648-8fa0-64227d998318 req-b46e3c19-7163-426d-bfb5-3412ac920ff5 service nova] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Received event network-changed-13606f1d-602f-4c77-b90b-32322653e54e {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:16:50 user nova-compute[70975]: DEBUG nova.compute.manager [req-7d6e8e18-2cc3-4648-8fa0-64227d998318 req-b46e3c19-7163-426d-bfb5-3412ac920ff5 service nova] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Refreshing instance network info cache due to event network-changed-13606f1d-602f-4c77-b90b-32322653e54e. 
{{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 18 16:16:50 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-7d6e8e18-2cc3-4648-8fa0-64227d998318 req-b46e3c19-7163-426d-bfb5-3412ac920ff5 service nova] Acquiring lock "refresh_cache-8e1ccfc5-90a7-443f-83e2-c07be27d6c7c" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:16:50 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-7d6e8e18-2cc3-4648-8fa0-64227d998318 req-b46e3c19-7163-426d-bfb5-3412ac920ff5 service nova] Acquired lock "refresh_cache-8e1ccfc5-90a7-443f-83e2-c07be27d6c7c" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:16:50 user nova-compute[70975]: DEBUG nova.network.neutron [req-7d6e8e18-2cc3-4648-8fa0-64227d998318 req-b46e3c19-7163-426d-bfb5-3412ac920ff5 service nova] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Refreshing network info cache for port 13606f1d-602f-4c77-b90b-32322653e54e {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 18 16:16:51 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:51 user nova-compute[70975]: DEBUG nova.network.neutron [req-7d6e8e18-2cc3-4648-8fa0-64227d998318 req-b46e3c19-7163-426d-bfb5-3412ac920ff5 service nova] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Updated VIF entry in instance network info cache for port 13606f1d-602f-4c77-b90b-32322653e54e. {{(pid=70975) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 18 16:16:51 user nova-compute[70975]: DEBUG nova.network.neutron [req-7d6e8e18-2cc3-4648-8fa0-64227d998318 req-b46e3c19-7163-426d-bfb5-3412ac920ff5 service nova] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Updating instance_info_cache with network_info: [{"id": "13606f1d-602f-4c77-b90b-32322653e54e", "address": "fa:16:3e:dc:c9:32", "network": {"id": "7f49a051-667b-4e91-80de-f4bbf2d6f09e", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-316224389-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.97", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d82a93c1cb9b4a4da7114874ddf0aa27", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap13606f1d-60", "ovs_interfaceid": "13606f1d-602f-4c77-b90b-32322653e54e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:16:51 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-7d6e8e18-2cc3-4648-8fa0-64227d998318 req-b46e3c19-7163-426d-bfb5-3412ac920ff5 service nova] Releasing lock "refresh_cache-8e1ccfc5-90a7-443f-83e2-c07be27d6c7c" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:16:52 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog 
[-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:52 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-22aa69f8-b70e-4d33-b6bc-a5043060e74b tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Acquiring lock "8e1ccfc5-90a7-443f-83e2-c07be27d6c7c" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:16:52 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-22aa69f8-b70e-4d33-b6bc-a5043060e74b tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Lock "8e1ccfc5-90a7-443f-83e2-c07be27d6c7c" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:16:52 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-22aa69f8-b70e-4d33-b6bc-a5043060e74b tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Acquiring lock "8e1ccfc5-90a7-443f-83e2-c07be27d6c7c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:16:52 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-22aa69f8-b70e-4d33-b6bc-a5043060e74b tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Lock "8e1ccfc5-90a7-443f-83e2-c07be27d6c7c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:16:52 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-22aa69f8-b70e-4d33-b6bc-a5043060e74b tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Lock "8e1ccfc5-90a7-443f-83e2-c07be27d6c7c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:16:52 user nova-compute[70975]: INFO nova.compute.manager [None req-22aa69f8-b70e-4d33-b6bc-a5043060e74b tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Terminating instance Apr 18 16:16:52 user nova-compute[70975]: DEBUG nova.compute.manager [None req-22aa69f8-b70e-4d33-b6bc-a5043060e74b tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Start destroying the instance on the hypervisor. 
{{(pid=70975) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 18 16:16:52 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:52 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:52 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:52 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:52 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Acquiring lock "e32196da-a530-4422-8566-5edb01f3cc62" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:16:52 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Lock "e32196da-a530-4422-8566-5edb01f3cc62" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:16:52 user nova-compute[70975]: DEBUG nova.compute.manager [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Starting instance... 
{{(pid=70975) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 18 16:16:52 user nova-compute[70975]: DEBUG nova.compute.manager [req-871b82ac-6874-471c-8dd4-531c2d8ea002 req-c4fc2e36-c798-42b2-b31e-ce9680965f0b service nova] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Received event network-vif-unplugged-13606f1d-602f-4c77-b90b-32322653e54e {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:16:52 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-871b82ac-6874-471c-8dd4-531c2d8ea002 req-c4fc2e36-c798-42b2-b31e-ce9680965f0b service nova] Acquiring lock "8e1ccfc5-90a7-443f-83e2-c07be27d6c7c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:16:52 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-871b82ac-6874-471c-8dd4-531c2d8ea002 req-c4fc2e36-c798-42b2-b31e-ce9680965f0b service nova] Lock "8e1ccfc5-90a7-443f-83e2-c07be27d6c7c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:16:52 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-871b82ac-6874-471c-8dd4-531c2d8ea002 req-c4fc2e36-c798-42b2-b31e-ce9680965f0b service nova] Lock "8e1ccfc5-90a7-443f-83e2-c07be27d6c7c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:16:52 user nova-compute[70975]: DEBUG nova.compute.manager [req-871b82ac-6874-471c-8dd4-531c2d8ea002 req-c4fc2e36-c798-42b2-b31e-ce9680965f0b service nova] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] No waiting events found dispatching network-vif-unplugged-13606f1d-602f-4c77-b90b-32322653e54e {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:16:52 user nova-compute[70975]: DEBUG nova.compute.manager [req-871b82ac-6874-471c-8dd4-531c2d8ea002 req-c4fc2e36-c798-42b2-b31e-ce9680965f0b service nova] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Received event network-vif-unplugged-13606f1d-602f-4c77-b90b-32322653e54e for instance with task_state deleting. 
{{(pid=70975) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 18 16:16:53 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:16:53 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:16:53 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=70975) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 18 16:16:53 user nova-compute[70975]: INFO nova.compute.claims [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Claim successful on node user Apr 18 16:16:53 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Instance destroyed successfully. 
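[annotation] The "Claim successful on node user" record above is checked against the inventory reported a few records further down for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9. Placement derives usable capacity per resource class roughly as (total - reserved) * allocation_ratio; plugging in the reported numbers (a worked sketch, values copied from the inventory record below):

    # Inventory as reported below for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9.
    inventory = {
        "VCPU":      {"total": 12,    "reserved": 0,   "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 16023, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB":   {"total": 40,    "reserved": 0,   "allocation_ratio": 1.0},
    }

    def schedulable_capacity(inv):
        # Usable capacity per resource class: (total - reserved) * allocation_ratio.
        return {rc: (v["total"] - v["reserved"]) * v["allocation_ratio"]
                for rc, v in inv.items()}

    print(schedulable_capacity(inventory))
    # {'VCPU': 48.0, 'MEMORY_MB': 15511.0, 'DISK_GB': 40.0}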
Apr 18 16:16:53 user nova-compute[70975]: DEBUG nova.objects.instance [None req-22aa69f8-b70e-4d33-b6bc-a5043060e74b tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Lazy-loading 'resources' on Instance uuid 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:16:53 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-22aa69f8-b70e-4d33-b6bc-a5043060e74b tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:14:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-1897500495',display_name='tempest-AttachVolumeTestJSON-server-1897500495',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-1897500495',id=8,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLNl9658QV9oAyWsF+PCDoKNB8f2Ysl88swP+0slbqtbCbBmKteLMBpQfjt+1JvV5krJu0v93BvOWlct8ODb6udN1fTuqEBomWxKiKxQ9Jd2pVu6lIa5zb/YgKK7JjSPmQ==',key_name='tempest-keypair-1312731460',keypairs=,launch_index=0,launched_at=2023-04-18T16:15:08Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='d82a93c1cb9b4a4da7114874ddf0aa27',ramdisk_id='',reservation_id='r-wiop3oaa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeTestJSON-313351389',owner_user_name='tempest-AttachVolumeTestJSON-313351389-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-18T16:15:08Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fd46686fd5b845cca0f3d9452a86f4ca',uuid=8e1ccfc5-90a7-443f-83e2-c07be27d6c7c,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "13606f1d-602f-4c77-b90b-32322653e54e", "address": "fa:16:3e:dc:c9:32", "network": {"id": "7f49a051-667b-4e91-80de-f4bbf2d6f09e", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-316224389-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": [{"address": "172.24.4.97", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d82a93c1cb9b4a4da7114874ddf0aa27", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap13606f1d-60", "ovs_interfaceid": "13606f1d-602f-4c77-b90b-32322653e54e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 18 16:16:53 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-22aa69f8-b70e-4d33-b6bc-a5043060e74b tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Converting VIF {"id": "13606f1d-602f-4c77-b90b-32322653e54e", "address": "fa:16:3e:dc:c9:32", "network": {"id": "7f49a051-667b-4e91-80de-f4bbf2d6f09e", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-316224389-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.97", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d82a93c1cb9b4a4da7114874ddf0aa27", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap13606f1d-60", "ovs_interfaceid": "13606f1d-602f-4c77-b90b-32322653e54e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:16:53 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-22aa69f8-b70e-4d33-b6bc-a5043060e74b tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:dc:c9:32,bridge_name='br-int',has_traffic_filtering=True,id=13606f1d-602f-4c77-b90b-32322653e54e,network=Network(7f49a051-667b-4e91-80de-f4bbf2d6f09e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13606f1d-60') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:16:53 user nova-compute[70975]: DEBUG os_vif [None req-22aa69f8-b70e-4d33-b6bc-a5043060e74b tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:dc:c9:32,bridge_name='br-int',has_traffic_filtering=True,id=13606f1d-602f-4c77-b90b-32322653e54e,network=Network(7f49a051-667b-4e91-80de-f4bbf2d6f09e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13606f1d-60') {{(pid=70975) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 18 16:16:53 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:53 user nova-compute[70975]: DEBUG 
ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap13606f1d-60, bridge=br-int, if_exists=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:16:53 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:53 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:16:53 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:16:53 user nova-compute[70975]: INFO os_vif [None req-22aa69f8-b70e-4d33-b6bc-a5043060e74b tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:dc:c9:32,bridge_name='br-int',has_traffic_filtering=True,id=13606f1d-602f-4c77-b90b-32322653e54e,network=Network(7f49a051-667b-4e91-80de-f4bbf2d6f09e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13606f1d-60') Apr 18 16:16:53 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-22aa69f8-b70e-4d33-b6bc-a5043060e74b tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Deleting instance files /opt/stack/data/nova/instances/8e1ccfc5-90a7-443f-83e2-c07be27d6c7c_del Apr 18 16:16:53 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-22aa69f8-b70e-4d33-b6bc-a5043060e74b tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Deletion of /opt/stack/data/nova/instances/8e1ccfc5-90a7-443f-83e2-c07be27d6c7c_del complete Apr 18 16:16:53 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:16:53 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.422s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:16:53 user nova-compute[70975]: DEBUG 
nova.compute.manager [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Start building networks asynchronously for instance. {{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 18 16:16:53 user nova-compute[70975]: INFO nova.compute.manager [None req-22aa69f8-b70e-4d33-b6bc-a5043060e74b tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Took 0.87 seconds to destroy the instance on the hypervisor. Apr 18 16:16:53 user nova-compute[70975]: DEBUG oslo.service.loopingcall [None req-22aa69f8-b70e-4d33-b6bc-a5043060e74b tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=70975) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 18 16:16:53 user nova-compute[70975]: DEBUG nova.compute.manager [-] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Deallocating network for instance {{(pid=70975) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 18 16:16:53 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] deallocate_for_instance() {{(pid=70975) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 18 16:16:53 user nova-compute[70975]: DEBUG nova.compute.manager [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Allocating IP information in the background. {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 18 16:16:53 user nova-compute[70975]: DEBUG nova.network.neutron [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: e32196da-a530-4422-8566-5edb01f3cc62] allocate_for_instance() {{(pid=70975) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 18 16:16:53 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 18 16:16:53 user nova-compute[70975]: DEBUG nova.compute.manager [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Start building block device mappings for instance. 
{{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 18 16:16:53 user nova-compute[70975]: DEBUG nova.policy [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '73a99bbf510f4f67bb7a35901ba3edc5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f9987eeaa6b24ca48e80e8d5318f02ac', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70975) authorize /opt/stack/nova/nova/policy.py:203}} Apr 18 16:16:53 user nova-compute[70975]: DEBUG nova.compute.manager [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Start spawning the instance on the hypervisor. {{(pid=70975) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 18 16:16:53 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Creating instance directory {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 18 16:16:53 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Creating image(s) Apr 18 16:16:53 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Acquiring lock "/opt/stack/data/nova/instances/e32196da-a530-4422-8566-5edb01f3cc62/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:16:53 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Lock "/opt/stack/data/nova/instances/e32196da-a530-4422-8566-5edb01f3cc62/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:16:53 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Lock "/opt/stack/data/nova/instances/e32196da-a530-4422-8566-5edb01f3cc62/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:16:53 user nova-compute[70975]: DEBUG 
oslo_concurrency.processutils [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:16:53 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.132s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:16:53 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Acquiring lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:16:53 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.005s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:16:53 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:16:54 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.159s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:16:54 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o 
backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/e32196da-a530-4422-8566-5edb01f3cc62/disk 1073741824 {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:16:54 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/e32196da-a530-4422-8566-5edb01f3cc62/disk 1073741824" returned: 0 in 0.052s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:16:54 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.218s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:16:54 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:16:54 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.177s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:16:54 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Checking if we can resize image /opt/stack/data/nova/instances/e32196da-a530-4422-8566-5edb01f3cc62/disk. 
size=1073741824 {{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 18 16:16:54 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e32196da-a530-4422-8566-5edb01f3cc62/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:16:54 user nova-compute[70975]: DEBUG nova.network.neutron [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Successfully created port: 203a232c-488a-427e-bf18-e99feec680b6 {{(pid=70975) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 18 16:16:54 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Updating instance_info_cache with network_info: [] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:16:54 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Took 0.88 seconds to deallocate network for instance. Apr 18 16:16:54 user nova-compute[70975]: DEBUG nova.compute.manager [req-45ad7d17-5853-4ad6-9ad9-195fd5a81a1b req-4e8cad93-a9e8-46e4-99a2-11f51e054216 service nova] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Received event network-vif-deleted-13606f1d-602f-4c77-b90b-32322653e54e {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:16:54 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e32196da-a530-4422-8566-5edb01f3cc62/disk --force-share --output=json" returned: 0 in 0.162s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:16:54 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Cannot resize image /opt/stack/data/nova/instances/e32196da-a530-4422-8566-5edb01f3cc62/disk to a smaller size. 
{{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 18 16:16:54 user nova-compute[70975]: DEBUG nova.objects.instance [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Lazy-loading 'migration_context' on Instance uuid e32196da-a530-4422-8566-5edb01f3cc62 {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:16:54 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Created local disks {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 18 16:16:54 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Ensure instance console log exists: /opt/stack/data/nova/instances/e32196da-a530-4422-8566-5edb01f3cc62/console.log {{(pid=70975) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 18 16:16:54 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:16:54 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:16:54 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:16:54 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-22aa69f8-b70e-4d33-b6bc-a5043060e74b tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:16:54 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-22aa69f8-b70e-4d33-b6bc-a5043060e74b tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:16:54 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None 
req-22aa69f8-b70e-4d33-b6bc-a5043060e74b tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:16:54 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-22aa69f8-b70e-4d33-b6bc-a5043060e74b tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:16:54 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-22aa69f8-b70e-4d33-b6bc-a5043060e74b tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.272s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:16:54 user nova-compute[70975]: INFO nova.scheduler.client.report [None req-22aa69f8-b70e-4d33-b6bc-a5043060e74b tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Deleted allocations for instance 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c Apr 18 16:16:54 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-22aa69f8-b70e-4d33-b6bc-a5043060e74b tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Lock "8e1ccfc5-90a7-443f-83e2-c07be27d6c7c" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.256s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG nova.compute.manager [req-a82fc315-1c74-4dbd-a052-1d25965bd4b3 req-de19b651-1167-4f8a-b713-0f768cac1bbc service nova] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Received event network-vif-plugged-13606f1d-602f-4c77-b90b-32322653e54e {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-a82fc315-1c74-4dbd-a052-1d25965bd4b3 req-de19b651-1167-4f8a-b713-0f768cac1bbc service nova] Acquiring lock "8e1ccfc5-90a7-443f-83e2-c07be27d6c7c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-a82fc315-1c74-4dbd-a052-1d25965bd4b3 req-de19b651-1167-4f8a-b713-0f768cac1bbc service nova] Lock "8e1ccfc5-90a7-443f-83e2-c07be27d6c7c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils 
[req-a82fc315-1c74-4dbd-a052-1d25965bd4b3 req-de19b651-1167-4f8a-b713-0f768cac1bbc service nova] Lock "8e1ccfc5-90a7-443f-83e2-c07be27d6c7c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG nova.compute.manager [req-a82fc315-1c74-4dbd-a052-1d25965bd4b3 req-de19b651-1167-4f8a-b713-0f768cac1bbc service nova] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] No waiting events found dispatching network-vif-plugged-13606f1d-602f-4c77-b90b-32322653e54e {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:16:55 user nova-compute[70975]: WARNING nova.compute.manager [req-a82fc315-1c74-4dbd-a052-1d25965bd4b3 req-de19b651-1167-4f8a-b713-0f768cac1bbc service nova] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Received unexpected event network-vif-plugged-13606f1d-602f-4c77-b90b-32322653e54e for instance with vm_state deleted and task_state None. Apr 18 16:16:55 user nova-compute[70975]: DEBUG nova.network.neutron [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Successfully updated port: 203a232c-488a-427e-bf18-e99feec680b6 {{(pid=70975) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Acquiring lock "refresh_cache-e32196da-a530-4422-8566-5edb01f3cc62" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Acquired lock "refresh_cache-e32196da-a530-4422-8566-5edb01f3cc62" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG nova.network.neutron [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Building network info cache for instance {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG nova.network.neutron [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Instance cache missing network info. 
{{(pid=70975) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG nova.network.neutron [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Updating instance_info_cache with network_info: [{"id": "203a232c-488a-427e-bf18-e99feec680b6", "address": "fa:16:3e:e2:df:e9", "network": {"id": "51cddd0f-0e4b-4d37-be40-ce5592263bc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1803491920-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f9987eeaa6b24ca48e80e8d5318f02ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap203a232c-48", "ovs_interfaceid": "203a232c-488a-427e-bf18-e99feec680b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Releasing lock "refresh_cache-e32196da-a530-4422-8566-5edb01f3cc62" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG nova.compute.manager [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Instance network_info: |[{"id": "203a232c-488a-427e-bf18-e99feec680b6", "address": "fa:16:3e:e2:df:e9", "network": {"id": "51cddd0f-0e4b-4d37-be40-ce5592263bc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1803491920-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f9987eeaa6b24ca48e80e8d5318f02ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap203a232c-48", "ovs_interfaceid": "203a232c-488a-427e-bf18-e99feec680b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: 
e32196da-a530-4422-8566-5edb01f3cc62] Start _get_guest_xml network_info=[{"id": "203a232c-488a-427e-bf18-e99feec680b6", "address": "fa:16:3e:e2:df:e9", "network": {"id": "51cddd0f-0e4b-4d37-be40-ce5592263bc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1803491920-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f9987eeaa6b24ca48e80e8d5318f02ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap203a232c-48", "ovs_interfaceid": "203a232c-488a-427e-bf18-e99feec680b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encrypted': False, 'device_type': 'disk', 'encryption_secret_uuid': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'b11a20de-f82a-4158-b53e-0a0c7a1552cb'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 18 16:16:55 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:16:55 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
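The network_info and vif blobs in the entries above are ordinary JSON once lifted out of the log line. As a reading aid, here is a minimal stand-alone sketch (summarize_vif is a hypothetical helper, not part of Nova) that pulls out the fields usually needed when tracing a port through these entries; the literal values are the ones recorded for port 203a232c-488a-427e-bf18-e99feec680b6:

    # Sketch only: summarize one network_info entry of the shape logged above.
    vif = {
        "id": "203a232c-488a-427e-bf18-e99feec680b6",
        "address": "fa:16:3e:e2:df:e9",
        "devname": "tap203a232c-48",
        "active": False,
        "network": {
            "id": "51cddd0f-0e4b-4d37-be40-ce5592263bc2",
            "bridge": "br-int",
            "subnets": [{
                "cidr": "10.0.0.0/28",
                "ips": [{"address": "10.0.0.7", "type": "fixed",
                         "floating_ips": []}],
            }],
            "meta": {"mtu": 1442, "tunneled": True},
        },
    }

    def summarize_vif(vif):
        """Return (port id, MAC, tap device, fixed IPs, floating IPs)."""
        fixed, floating = [], []
        for subnet in vif["network"]["subnets"]:
            for ip in subnet["ips"]:
                fixed.append(ip["address"])
                floating.extend(f["address"] for f in ip.get("floating_ips", []))
        return vif["id"], vif["address"], vif["devname"], fixed, floating

    print(summarize_vif(vif))
    # ('203a232c-488a-427e-bf18-e99feec680b6', 'fa:16:3e:e2:df:e9',
    #  'tap203a232c-48', ['10.0.0.7'], [])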
Apr 18 16:16:55 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70975) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-18T16:11:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=), allow threads: True {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Flavor limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Image limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Flavor pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Image pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) 
{{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Got 1 possible topologies {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:16:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1520665803',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-1520665803',id=11,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIM5ALzjyybYMOK2Z1cVTnyDj3Z+LX/Xt2LMBmK17WbJDSTepQBn453Oo4oGngedtyHHoL/jHz286S3ijelgC//rOYsCgxrKNn3otRSI8UvONPGZU5icbqSs6c6+xBe3GQ==',key_name='tempest-keypair-48834850',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f9987eeaa6b24ca48e80e8d5318f02ac',ramdisk_id='',reservation_id='r-s3q9g9uz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-1663710151',owner_user_name='tempest-AttachVolumeShelveTestJSON-1663710151-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:16:54Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='73a99bbf510f4f67bb7a35901ba3edc5',uuid=e32196da-a530-4422-8566-5edb01f3cc62,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "203a232c-488a-427e-bf18-e99feec680b6", "address": "fa:16:3e:e2:df:e9", "network": {"id": "51cddd0f-0e4b-4d37-be40-ce5592263bc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1803491920-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f9987eeaa6b24ca48e80e8d5318f02ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap203a232c-48", "ovs_interfaceid": "203a232c-488a-427e-bf18-e99feec680b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70975) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Converting VIF {"id": "203a232c-488a-427e-bf18-e99feec680b6", "address": "fa:16:3e:e2:df:e9", "network": {"id": "51cddd0f-0e4b-4d37-be40-ce5592263bc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1803491920-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], 
"meta": {"injected": false, "tenant_id": "f9987eeaa6b24ca48e80e8d5318f02ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap203a232c-48", "ovs_interfaceid": "203a232c-488a-427e-bf18-e99feec680b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:df:e9,bridge_name='br-int',has_traffic_filtering=True,id=203a232c-488a-427e-bf18-e99feec680b6,network=Network(51cddd0f-0e4b-4d37-be40-ce5592263bc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap203a232c-48') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG nova.objects.instance [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Lazy-loading 'pci_devices' on Instance uuid e32196da-a530-4422-8566-5edb01f3cc62 {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: e32196da-a530-4422-8566-5edb01f3cc62] End _get_guest_xml xml= Apr 18 16:16:55 user nova-compute[70975]: e32196da-a530-4422-8566-5edb01f3cc62 Apr 18 16:16:55 user nova-compute[70975]: instance-0000000b Apr 18 16:16:55 user nova-compute[70975]: 131072 Apr 18 16:16:55 user nova-compute[70975]: 1 Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: tempest-AttachVolumeShelveTestJSON-server-1520665803 Apr 18 16:16:55 user nova-compute[70975]: 2023-04-18 16:16:55 Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: 128 Apr 18 16:16:55 user nova-compute[70975]: 1 Apr 18 16:16:55 user nova-compute[70975]: 0 Apr 18 16:16:55 user nova-compute[70975]: 0 Apr 18 16:16:55 user nova-compute[70975]: 1 Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: tempest-AttachVolumeShelveTestJSON-1663710151-project-member Apr 18 16:16:55 user nova-compute[70975]: tempest-AttachVolumeShelveTestJSON-1663710151 Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: OpenStack Foundation Apr 18 16:16:55 user nova-compute[70975]: OpenStack Nova Apr 18 16:16:55 user 
nova-compute[70975]: 0.0.0 Apr 18 16:16:55 user nova-compute[70975]: e32196da-a530-4422-8566-5edb01f3cc62 Apr 18 16:16:55 user nova-compute[70975]: e32196da-a530-4422-8566-5edb01f3cc62 Apr 18 16:16:55 user nova-compute[70975]: Virtual Machine Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: hvm Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Nehalem Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: /dev/urandom Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: Apr 18 16:16:55 user nova-compute[70975]: {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:16:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1520665803',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-1520665803',id=11,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIM5ALzjyybYMOK2Z1cVTnyDj3Z+LX/Xt2LMBmK17WbJDSTepQBn453Oo4oGngedtyHHoL/jHz286S3ijelgC//rOYsCgxrKNn3otRSI8UvONPGZU5icbqSs6c6+xBe3GQ==',key_name='tempest-keypair-48834850',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f9987eeaa6b24ca48e80e8d5318f02ac',ramdisk_id='',reservation_id='r-s3q9g9uz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-1663710151',owner_user_name='tempest-AttachVolumeShelveTestJSON-1663710151-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:16:54Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='73a99bbf510f4f67bb7a35901ba3edc5',uuid=e32196da-a530-4422-8566-5edb01f3cc62,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "203a232c-488a-427e-bf18-e99feec680b6", "address": "fa:16:3e:e2:df:e9", "network": {"id": "51cddd0f-0e4b-4d37-be40-ce5592263bc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1803491920-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f9987eeaa6b24ca48e80e8d5318f02ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap203a232c-48", "ovs_interfaceid": "203a232c-488a-427e-bf18-e99feec680b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Converting VIF {"id": "203a232c-488a-427e-bf18-e99feec680b6", "address": "fa:16:3e:e2:df:e9", "network": {"id": "51cddd0f-0e4b-4d37-be40-ce5592263bc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1803491920-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "f9987eeaa6b24ca48e80e8d5318f02ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap203a232c-48", "ovs_interfaceid": "203a232c-488a-427e-bf18-e99feec680b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:df:e9,bridge_name='br-int',has_traffic_filtering=True,id=203a232c-488a-427e-bf18-e99feec680b6,network=Network(51cddd0f-0e4b-4d37-be40-ce5592263bc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap203a232c-48') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG os_vif [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:df:e9,bridge_name='br-int',has_traffic_filtering=True,id=203a232c-488a-427e-bf18-e99feec680b6,network=Network(51cddd0f-0e4b-4d37-be40-ce5592263bc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap203a232c-48') {{(pid=70975) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap203a232c-48, may_exist=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap203a232c-48, col_values=(('external_ids', {'iface-id': '203a232c-488a-427e-bf18-e99feec680b6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e2:df:e9', 'vm-uuid': 'e32196da-a530-4422-8566-5edb01f3cc62'}),)) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:16:55 user 
nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:55 user nova-compute[70975]: INFO os_vif [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:df:e9,bridge_name='br-int',has_traffic_filtering=True,id=203a232c-488a-427e-bf18-e99feec680b6,network=Network(51cddd0f-0e4b-4d37-be40-ce5592263bc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap203a232c-48') Apr 18 16:16:55 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] No BDM found with device name vda, not building metadata. {{(pid=70975) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 18 16:16:55 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] No VIF found with MAC fa:16:3e:e2:df:e9, not building metadata {{(pid=70975) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 18 16:16:56 user nova-compute[70975]: DEBUG nova.compute.manager [req-5f11fa2c-2912-42d6-a26b-7e500d56e5c0 req-bf74449d-3170-4026-a8be-9839e3fa7290 service nova] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Received event network-changed-8029e455-c16d-48cd-93e1-cf56c226cc4a {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:16:56 user nova-compute[70975]: DEBUG nova.compute.manager [req-5f11fa2c-2912-42d6-a26b-7e500d56e5c0 req-bf74449d-3170-4026-a8be-9839e3fa7290 service nova] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Refreshing instance network info cache due to event network-changed-8029e455-c16d-48cd-93e1-cf56c226cc4a. 
{{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 18 16:16:56 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-5f11fa2c-2912-42d6-a26b-7e500d56e5c0 req-bf74449d-3170-4026-a8be-9839e3fa7290 service nova] Acquiring lock "refresh_cache-8aaa4e97-9439-4760-9e05-8b248b02074f" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:16:56 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-5f11fa2c-2912-42d6-a26b-7e500d56e5c0 req-bf74449d-3170-4026-a8be-9839e3fa7290 service nova] Acquired lock "refresh_cache-8aaa4e97-9439-4760-9e05-8b248b02074f" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:16:56 user nova-compute[70975]: DEBUG nova.network.neutron [req-5f11fa2c-2912-42d6-a26b-7e500d56e5c0 req-bf74449d-3170-4026-a8be-9839e3fa7290 service nova] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Refreshing network info cache for port 8029e455-c16d-48cd-93e1-cf56c226cc4a {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 18 16:16:56 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:16:56 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:16:56 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=70975) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 18 16:16:56 user nova-compute[70975]: DEBUG nova.network.neutron [req-5f11fa2c-2912-42d6-a26b-7e500d56e5c0 req-bf74449d-3170-4026-a8be-9839e3fa7290 service nova] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Updated VIF entry in instance network info cache for port 8029e455-c16d-48cd-93e1-cf56c226cc4a. 
{{(pid=70975) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 18 16:16:56 user nova-compute[70975]: DEBUG nova.network.neutron [req-5f11fa2c-2912-42d6-a26b-7e500d56e5c0 req-bf74449d-3170-4026-a8be-9839e3fa7290 service nova] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Updating instance_info_cache with network_info: [{"id": "8029e455-c16d-48cd-93e1-cf56c226cc4a", "address": "fa:16:3e:38:a4:82", "network": {"id": "7692c2b5-931d-4d1d-aae6-384ce4ff5ff0", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-144924554-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.121", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e6fc24a9e1b646a2a08df4f53f712267", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8029e455-c1", "ovs_interfaceid": "8029e455-c16d-48cd-93e1-cf56c226cc4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:16:56 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:57 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-5f11fa2c-2912-42d6-a26b-7e500d56e5c0 req-bf74449d-3170-4026-a8be-9839e3fa7290 service nova] Releasing lock "refresh_cache-8aaa4e97-9439-4760-9e05-8b248b02074f" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:16:57 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:57 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:57 user nova-compute[70975]: DEBUG nova.compute.manager [req-751e2e18-07af-4145-8c4e-1b49e5284dbf req-482bba64-9449-4b5e-a9ef-25801c1a9e1a service nova] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Received event network-changed-203a232c-488a-427e-bf18-e99feec680b6 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:16:57 user nova-compute[70975]: DEBUG nova.compute.manager [req-751e2e18-07af-4145-8c4e-1b49e5284dbf req-482bba64-9449-4b5e-a9ef-25801c1a9e1a service nova] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Refreshing instance network info cache due to event network-changed-203a232c-488a-427e-bf18-e99feec680b6. 
{{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 18 16:16:57 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-751e2e18-07af-4145-8c4e-1b49e5284dbf req-482bba64-9449-4b5e-a9ef-25801c1a9e1a service nova] Acquiring lock "refresh_cache-e32196da-a530-4422-8566-5edb01f3cc62" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:16:57 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-751e2e18-07af-4145-8c4e-1b49e5284dbf req-482bba64-9449-4b5e-a9ef-25801c1a9e1a service nova] Acquired lock "refresh_cache-e32196da-a530-4422-8566-5edb01f3cc62" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:16:57 user nova-compute[70975]: DEBUG nova.network.neutron [req-751e2e18-07af-4145-8c4e-1b49e5284dbf req-482bba64-9449-4b5e-a9ef-25801c1a9e1a service nova] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Refreshing network info cache for port 203a232c-488a-427e-bf18-e99feec680b6 {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 18 16:16:57 user nova-compute[70975]: DEBUG nova.compute.manager [req-2473d97e-7f2d-4e59-b1d1-5de0291d2851 req-faa749fb-3d81-4326-ba3e-562fa1b3fcdb service nova] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Received event network-vif-plugged-203a232c-488a-427e-bf18-e99feec680b6 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:16:57 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-2473d97e-7f2d-4e59-b1d1-5de0291d2851 req-faa749fb-3d81-4326-ba3e-562fa1b3fcdb service nova] Acquiring lock "e32196da-a530-4422-8566-5edb01f3cc62-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:16:57 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-2473d97e-7f2d-4e59-b1d1-5de0291d2851 req-faa749fb-3d81-4326-ba3e-562fa1b3fcdb service nova] Lock "e32196da-a530-4422-8566-5edb01f3cc62-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:16:57 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-2473d97e-7f2d-4e59-b1d1-5de0291d2851 req-faa749fb-3d81-4326-ba3e-562fa1b3fcdb service nova] Lock "e32196da-a530-4422-8566-5edb01f3cc62-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:16:57 user nova-compute[70975]: DEBUG nova.compute.manager [req-2473d97e-7f2d-4e59-b1d1-5de0291d2851 req-faa749fb-3d81-4326-ba3e-562fa1b3fcdb service nova] [instance: e32196da-a530-4422-8566-5edb01f3cc62] No waiting events found dispatching network-vif-plugged-203a232c-488a-427e-bf18-e99feec680b6 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:16:57 user nova-compute[70975]: WARNING nova.compute.manager [req-2473d97e-7f2d-4e59-b1d1-5de0291d2851 req-faa749fb-3d81-4326-ba3e-562fa1b3fcdb service nova] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Received unexpected event network-vif-plugged-203a232c-488a-427e-bf18-e99feec680b6 for instance with vm_state building and task_state spawning. 
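
The VIF plug above is carried out through ovsdbapp transactions against the local OVSDB: an AddBridgeCommand for br-int, an AddPortCommand for tap203a232c-48, and a DbSetCommand that stamps the Neutron port ID, MAC and instance UUID into the Interface external_ids column. The sketch below is purely illustrative and is not what os-vif itself runs; it reproduces the same effect with ovs-vsctl via subprocess, using the bridge, port name and external_ids values taken from the log entries, and is intended for inspection on a throwaway host rather than for poking a live compute node.

    # Illustrative only: rough ovs-vsctl equivalent of the ovsdbapp
    # transactions logged above (values copied from the log entries).
    import subprocess

    BRIDGE = "br-int"
    PORT = "tap203a232c-48"
    EXTERNAL_IDS = {
        "iface-id": "203a232c-488a-427e-bf18-e99feec680b6",
        "iface-status": "active",
        "attached-mac": "fa:16:3e:e2:df:e9",
        "vm-uuid": "e32196da-a530-4422-8566-5edb01f3cc62",
    }

    def plug_port_with_ovs_vsctl():
        # AddBridgeCommand(may_exist=True, datapath_type=system) equivalent.
        subprocess.run(["ovs-vsctl", "--may-exist", "add-br", BRIDGE,
                        "--", "set", "Bridge", BRIDGE, "datapath_type=system"],
                       check=True)
        # AddPortCommand + DbSetCommand equivalent: add the tap port and
        # write the identifiers into the Interface external_ids column.
        cmd = ["ovs-vsctl", "--may-exist", "add-port", BRIDGE, PORT,
               "--", "set", "Interface", PORT]
        cmd += ["external-ids:%s=%s" % (k, v) for k, v in EXTERNAL_IDS.items()]
        subprocess.run(cmd, check=True)

    def show_port_external_ids():
        # Inspect what was written, for comparison with the DbSetCommand above.
        out = subprocess.run(
            ["ovs-vsctl", "get", "Interface", PORT, "external_ids"],
            capture_output=True, text=True, check=True)
        print(out.stdout.strip())

The --may-exist flags mirror may_exist=True in the logged commands and make the calls idempotent, which is also why the earlier AddBridgeCommand transaction reports "Transaction caused no change" when br-int already exists.
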
Apr 18 16:16:57 user nova-compute[70975]: DEBUG nova.compute.manager [None req-59d8e518-4715-41b3-9100-602e3a8747a7 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:16:57 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:57 user nova-compute[70975]: INFO nova.compute.manager [None req-59d8e518-4715-41b3-9100-602e3a8747a7 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] instance snapshotting Apr 18 16:16:57 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:16:57 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-59d8e518-4715-41b3-9100-602e3a8747a7 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Beginning live snapshot process Apr 18 16:16:57 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-59d8e518-4715-41b3-9100-602e3a8747a7 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8aaa4e97-9439-4760-9e05-8b248b02074f/disk --force-share --output=json -f qcow2 {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:16:57 user nova-compute[70975]: DEBUG nova.network.neutron [req-751e2e18-07af-4145-8c4e-1b49e5284dbf req-482bba64-9449-4b5e-a9ef-25801c1a9e1a service nova] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Updated VIF entry in instance network info cache for port 203a232c-488a-427e-bf18-e99feec680b6. 
{{(pid=70975) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 18 16:16:57 user nova-compute[70975]: DEBUG nova.network.neutron [req-751e2e18-07af-4145-8c4e-1b49e5284dbf req-482bba64-9449-4b5e-a9ef-25801c1a9e1a service nova] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Updating instance_info_cache with network_info: [{"id": "203a232c-488a-427e-bf18-e99feec680b6", "address": "fa:16:3e:e2:df:e9", "network": {"id": "51cddd0f-0e4b-4d37-be40-ce5592263bc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1803491920-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f9987eeaa6b24ca48e80e8d5318f02ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap203a232c-48", "ovs_interfaceid": "203a232c-488a-427e-bf18-e99feec680b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:16:57 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:57 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-751e2e18-07af-4145-8c4e-1b49e5284dbf req-482bba64-9449-4b5e-a9ef-25801c1a9e1a service nova] Releasing lock "refresh_cache-e32196da-a530-4422-8566-5edb01f3cc62" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:16:57 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:16:57 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-59d8e518-4715-41b3-9100-602e3a8747a7 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8aaa4e97-9439-4760-9e05-8b248b02074f/disk --force-share --output=json -f qcow2" returned: 0 in 0.149s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:16:57 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-59d8e518-4715-41b3-9100-602e3a8747a7 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8aaa4e97-9439-4760-9e05-8b248b02074f/disk --force-share --output=json -f qcow2 {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:16:57 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-59d8e518-4715-41b3-9100-602e3a8747a7 tempest-ServerStableDeviceRescueTest-1233154848 
tempest-ServerStableDeviceRescueTest-1233154848-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8aaa4e97-9439-4760-9e05-8b248b02074f/disk --force-share --output=json -f qcow2" returned: 0 in 0.149s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:16:57 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-59d8e518-4715-41b3-9100-602e3a8747a7 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:16:58 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-59d8e518-4715-41b3-9100-602e3a8747a7 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.127s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:16:58 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-59d8e518-4715-41b3-9100-602e3a8747a7 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmp4k_9oyxe/783b40d38cd84bf988b3b160602062c3.delta 1073741824 {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:16:58 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-59d8e518-4715-41b3-9100-602e3a8747a7 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmp4k_9oyxe/783b40d38cd84bf988b3b160602062c3.delta 1073741824" returned: 0 in 0.045s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:16:58 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-59d8e518-4715-41b3-9100-602e3a8747a7 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Quiescing instance not available: QEMU guest agent is not enabled. 
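
The live snapshot above first probes the instance disk and its cached base image with qemu-img info, then creates a qcow2 "delta" overlay backed by the _base file before the block-copy job starts. Below is a minimal sketch of that create/flatten/probe sequence; the flags mirror the logged qemu-img invocations, while the delta and output paths are placeholders.

    # Minimal sketch of the snapshot delta/convert/info commands logged above.
    # DELTA and OUTPUT are placeholder paths; the flags match the log.
    import json
    import subprocess

    BASE = "/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053"
    DELTA = "/tmp/snapshot.delta"        # hypothetical scratch path
    OUTPUT = "/tmp/snapshot.qcow2"       # hypothetical output path
    VIRTUAL_SIZE = 1073741824            # 1 GiB, as in the logged command

    def create_delta_and_flatten():
        # qcow2 overlay whose backing file is the raw base image; this is the
        # file the block-copy job writes into during the live snapshot.
        subprocess.run(
            ["qemu-img", "create", "-f", "qcow2",
             "-o", "backing_file=%s,backing_fmt=raw" % BASE,
             DELTA, str(VIRTUAL_SIZE)],
            check=True)
        # Once the block job finishes, the delta is flattened into a
        # standalone qcow2 for upload ("-t none" bypasses the host page cache).
        subprocess.run(
            ["qemu-img", "convert", "-t", "none", "-O", "qcow2", "-f", "qcow2",
             DELTA, OUTPUT],
            check=True)
        # Same probe as the logged "qemu-img info ... --output=json" calls.
        info = subprocess.run(
            ["qemu-img", "info", OUTPUT, "--force-share", "--output=json"],
            capture_output=True, text=True, check=True)
        return json.loads(info.stdout)
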
Apr 18 16:16:58 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:16:58 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:16:58 user nova-compute[70975]: DEBUG nova.virt.libvirt.guest [None req-59d8e518-4715-41b3-9100-602e3a8747a7 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] COPY block job progress, current cursor: 0 final cursor: 43778048 {{(pid=70975) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 18 16:16:59 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Resumed> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:16:59 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: e32196da-a530-4422-8566-5edb01f3cc62] VM Resumed (Lifecycle Event) Apr 18 16:16:59 user nova-compute[70975]: DEBUG nova.compute.manager [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Instance event wait completed in 0 seconds for {{(pid=70975) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 18 16:16:59 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Guest created on hypervisor {{(pid=70975) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 18 16:16:59 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Instance spawned successfully. 
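
The "COPY block job progress, current cursor: 0 final cursor: 43778048" line is the snapshot code polling the libvirt block-copy job until the cursor catches up with the end of the job. A rough polling sketch with libvirt-python is shown below; the connection URI and domain name are hypothetical, and the empty-result handling is a simplification of what the driver actually does.

    # Rough sketch of polling a libvirt block job until it completes,
    # assuming libvirt-python is installed; names below are placeholders.
    import time
    import libvirt

    def wait_for_block_job(dom, disk="vda", poll_interval=0.5):
        while True:
            info = dom.blockJobInfo(disk, 0)
            # An empty result means no job is active any more; otherwise the
            # dict carries the 'cur'/'end' cursors printed in the log above.
            if not info or info.get("cur") == info.get("end"):
                return
            time.sleep(poll_interval)

    if __name__ == "__main__":
        conn = libvirt.open("qemu:///system")          # hypothetical URI
        dom = conn.lookupByName("instance-00000001")   # hypothetical domain
        wait_for_block_job(dom)
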
Apr 18 16:16:59 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 18 16:16:59 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:16:59 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:16:59 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Found default for hw_cdrom_bus of ide {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:16:59 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Found default for hw_disk_bus of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:16:59 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Found default for hw_input_bus of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:16:59 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Found default for hw_pointer_model of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:16:59 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Found default for hw_video_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:16:59 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 
tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Found default for hw_vif_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:16:59 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: e32196da-a530-4422-8566-5edb01f3cc62] During sync_power_state the instance has a pending task (spawning). Skip. Apr 18 16:16:59 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Started> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:16:59 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: e32196da-a530-4422-8566-5edb01f3cc62] VM Started (Lifecycle Event) Apr 18 16:16:59 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:16:59 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:16:59 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: e32196da-a530-4422-8566-5edb01f3cc62] During sync_power_state the instance has a pending task (spawning). Skip. Apr 18 16:16:59 user nova-compute[70975]: INFO nova.compute.manager [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Took 5.52 seconds to spawn the instance on the hypervisor. Apr 18 16:16:59 user nova-compute[70975]: DEBUG nova.compute.manager [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:16:59 user nova-compute[70975]: DEBUG nova.virt.libvirt.guest [None req-59d8e518-4715-41b3-9100-602e3a8747a7 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] COPY block job progress, current cursor: 43778048 final cursor: 43778048 {{(pid=70975) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 18 16:16:59 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-59d8e518-4715-41b3-9100-602e3a8747a7 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Skipping quiescing instance: QEMU guest agent is not enabled. 
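
The power-state synchronization messages above compare numeric state codes: the database still holds 0 while the hypervisor reports 1 for the freshly started guest, and the sync is skipped because task_state is still spawning. As a reader aid, the snippet below spells out the usual mapping; it mirrors nova.compute.power_state, though the values should be checked against the deployed release.

    # Reader aid: the numeric power-state codes used in the sync messages
    # above (mirrors nova.compute.power_state; verify for your release).
    POWER_STATES = {
        0: "NOSTATE",    # "current DB power_state: 0" -- not yet recorded
        1: "RUNNING",    # "VM power_state: 1" -- guest is up on the hypervisor
        3: "PAUSED",
        4: "SHUTDOWN",
        6: "CRASHED",
        7: "SUSPENDED",
    }

    def describe_sync(db_state: int, vm_state: int) -> str:
        return "DB says %s, hypervisor says %s" % (
            POWER_STATES.get(db_state, "UNKNOWN"),
            POWER_STATES.get(vm_state, "UNKNOWN"))

    # describe_sync(0, 1) -> "DB says NOSTATE, hypervisor says RUNNING",
    # which is why the manager skips the sync while the task is 'spawning'.
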
Apr 18 16:16:59 user nova-compute[70975]: INFO nova.compute.manager [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Took 6.33 seconds to build instance. Apr 18 16:16:59 user nova-compute[70975]: DEBUG nova.compute.manager [req-92d28f7f-01ec-4e48-8593-6d71e7d394cc req-be9012f6-fbaa-43a6-a3b7-9c48b0b36e34 service nova] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Received event network-vif-plugged-203a232c-488a-427e-bf18-e99feec680b6 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:16:59 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-92d28f7f-01ec-4e48-8593-6d71e7d394cc req-be9012f6-fbaa-43a6-a3b7-9c48b0b36e34 service nova] Acquiring lock "e32196da-a530-4422-8566-5edb01f3cc62-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:16:59 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-92d28f7f-01ec-4e48-8593-6d71e7d394cc req-be9012f6-fbaa-43a6-a3b7-9c48b0b36e34 service nova] Lock "e32196da-a530-4422-8566-5edb01f3cc62-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:16:59 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-92d28f7f-01ec-4e48-8593-6d71e7d394cc req-be9012f6-fbaa-43a6-a3b7-9c48b0b36e34 service nova] Lock "e32196da-a530-4422-8566-5edb01f3cc62-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:16:59 user nova-compute[70975]: DEBUG nova.compute.manager [req-92d28f7f-01ec-4e48-8593-6d71e7d394cc req-be9012f6-fbaa-43a6-a3b7-9c48b0b36e34 service nova] [instance: e32196da-a530-4422-8566-5edb01f3cc62] No waiting events found dispatching network-vif-plugged-203a232c-488a-427e-bf18-e99feec680b6 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:16:59 user nova-compute[70975]: WARNING nova.compute.manager [req-92d28f7f-01ec-4e48-8593-6d71e7d394cc req-be9012f6-fbaa-43a6-a3b7-9c48b0b36e34 service nova] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Received unexpected event network-vif-plugged-203a232c-488a-427e-bf18-e99feec680b6 for instance with vm_state active and task_state None. 
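
The lock chatter around pop_instance_event comes from oslo.concurrency: the "Acquiring lock / Acquired lock / Releasing lock" lines (lockutils.py:312/315/333) are emitted by the lock() context manager, while the 'Lock "..." acquired by "..." :: waited' and '"released" ... :: held' lines (lockutils.py:409/423) come from the synchronized decorator's wrapper. A minimal sketch of both patterns follows, with hypothetical lock names.

    # Minimal sketch of the oslo.concurrency locking patterns that produce
    # the DEBUG lines above; lock names here are hypothetical.
    from oslo_concurrency import lockutils

    @lockutils.synchronized("demo-instance-events")
    def pop_event_for_instance():
        # Runs with the named in-process lock held; the decorator logs the
        # acquired/released messages with waited/held durations at DEBUG.
        return None

    def refresh_cache_example():
        # Explicit context-manager form, which logs the plain
        # Acquiring/Acquired/Releasing lock messages.
        with lockutils.lock("demo-refresh-cache"):
            pass
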
Apr 18 16:16:59 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-c24fde5a-f2f8-4a21-b57f-6f5257969f04 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Lock "e32196da-a530-4422-8566-5edb01f3cc62" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.431s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:16:59 user nova-compute[70975]: DEBUG nova.privsep.utils [None req-59d8e518-4715-41b3-9100-602e3a8747a7 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=70975) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 18 16:16:59 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-59d8e518-4715-41b3-9100-602e3a8747a7 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmp4k_9oyxe/783b40d38cd84bf988b3b160602062c3.delta /opt/stack/data/nova/instances/snapshots/tmp4k_9oyxe/783b40d38cd84bf988b3b160602062c3 {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:16:59 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-59d8e518-4715-41b3-9100-602e3a8747a7 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmp4k_9oyxe/783b40d38cd84bf988b3b160602062c3.delta /opt/stack/data/nova/instances/snapshots/tmp4k_9oyxe/783b40d38cd84bf988b3b160602062c3" returned: 0 in 0.272s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:16:59 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-59d8e518-4715-41b3-9100-602e3a8747a7 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Snapshot extracted, beginning image upload Apr 18 16:17:00 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:17:00 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Starting heal instance info cache {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 18 16:17:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "refresh_cache-1b530349-680e-4def-86ef-29c340efa175" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:17:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquired lock "refresh_cache-1b530349-680e-4def-86ef-29c340efa175" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:17:00 user nova-compute[70975]: DEBUG 
nova.network.neutron [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] [instance: 1b530349-680e-4def-86ef-29c340efa175] Forcefully refreshing network info cache for instance {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 18 16:17:00 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:00 user nova-compute[70975]: DEBUG nova.network.neutron [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] [instance: 1b530349-680e-4def-86ef-29c340efa175] Updating instance_info_cache with network_info: [{"id": "64d26c20-add4-4a63-bace-6a3678032692", "address": "fa:16:3e:33:ec:46", "network": {"id": "f5beaf4a-eeaf-454b-bde5-dd5e1f15e9dd", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-215585786-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "caa61b19cc4e4cd4bb7d41291c40ef1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap64d26c20-ad", "ovs_interfaceid": "64d26c20-add4-4a63-bace-6a3678032692", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:17:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Releasing lock "refresh_cache-1b530349-680e-4def-86ef-29c340efa175" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:17:00 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] [instance: 1b530349-680e-4def-86ef-29c340efa175] Updated the network info_cache for instance {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 18 16:17:00 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:17:00 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:17:00 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager.update_available_resource {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:17:01 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "compute_resources" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:17:01 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:17:01 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:17:01 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Auditing locally available compute resources for user (node: user) {{(pid=70975) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 18 16:17:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e32196da-a530-4422-8566-5edb01f3cc62/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:17:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e32196da-a530-4422-8566-5edb01f3cc62/disk --force-share --output=json" returned: 0 in 0.209s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:17:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e32196da-a530-4422-8566-5edb01f3cc62/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:17:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e32196da-a530-4422-8566-5edb01f3cc62/disk --force-share --output=json" returned: 0 in 0.147s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:17:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d7a293bf-a9bd-424e-ba11-bbed7dfea41c/disk.rescue --force-share --output=json {{(pid=70975) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:17:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d7a293bf-a9bd-424e-ba11-bbed7dfea41c/disk.rescue --force-share --output=json" returned: 0 in 0.143s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:17:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d7a293bf-a9bd-424e-ba11-bbed7dfea41c/disk.rescue --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:17:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d7a293bf-a9bd-424e-ba11-bbed7dfea41c/disk.rescue --force-share --output=json" returned: 0 in 0.135s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:17:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d7a293bf-a9bd-424e-ba11-bbed7dfea41c/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:17:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d7a293bf-a9bd-424e-ba11-bbed7dfea41c/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:17:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d7a293bf-a9bd-424e-ba11-bbed7dfea41c/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:17:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d7a293bf-a9bd-424e-ba11-bbed7dfea41c/disk --force-share --output=json" returned: 0 in 0.157s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:17:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c 
None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/da82d905-1ca1-403d-9598-7561e69b9704/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:17:02 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-59d8e518-4715-41b3-9100-602e3a8747a7 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Snapshot image upload complete Apr 18 16:17:02 user nova-compute[70975]: INFO nova.compute.manager [None req-59d8e518-4715-41b3-9100-602e3a8747a7 tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Took 4.78 seconds to snapshot the instance on the hypervisor. Apr 18 16:17:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/da82d905-1ca1-403d-9598-7561e69b9704/disk --force-share --output=json" returned: 0 in 0.150s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:17:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/da82d905-1ca1-403d-9598-7561e69b9704/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:17:02 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/da82d905-1ca1-403d-9598-7561e69b9704/disk --force-share --output=json" returned: 0 in 0.156s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:17:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8aaa4e97-9439-4760-9e05-8b248b02074f/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:17:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8aaa4e97-9439-4760-9e05-8b248b02074f/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=70975) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:17:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8aaa4e97-9439-4760-9e05-8b248b02074f/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:17:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8aaa4e97-9439-4760-9e05-8b248b02074f/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:17:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:17:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:17:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:17:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:17:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6c592508-0444-4b42-a0b5-e3d8bd97f5ba/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:17:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD 
"/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6c592508-0444-4b42-a0b5-e3d8bd97f5ba/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:17:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6c592508-0444-4b42-a0b5-e3d8bd97f5ba/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:17:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6c592508-0444-4b42-a0b5-e3d8bd97f5ba/disk --force-share --output=json" returned: 0 in 0.163s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:17:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0ad9c135-f279-4bd8-982d-65b45242adcf/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:17:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0ad9c135-f279-4bd8-982d-65b45242adcf/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:17:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0ad9c135-f279-4bd8-982d-65b45242adcf/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:17:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0ad9c135-f279-4bd8-982d-65b45242adcf/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:17:04 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 18 16:17:04 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:17:04 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Hypervisor/Node resource view: name=user free_ram=8325MB free_disk=26.554298400878906GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": 
null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", 
"address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70975) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 18 16:17:04 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:17:04 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:17:04 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance da82d905-1ca1-403d-9598-7561e69b9704 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:17:04 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 1b530349-680e-4def-86ef-29c340efa175 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:17:04 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance d7a293bf-a9bd-424e-ba11-bbed7dfea41c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:17:04 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 6c592508-0444-4b42-a0b5-e3d8bd97f5ba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:17:04 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 8aaa4e97-9439-4760-9e05-8b248b02074f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:17:04 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 0ad9c135-f279-4bd8-982d-65b45242adcf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:17:04 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance e32196da-a530-4422-8566-5edb01f3cc62 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:17:04 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Total usable vcpus: 12, total allocated vcpus: 7 {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 18 16:17:04 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Final resource view: name=user phys_ram=16023MB used_ram=1408MB phys_disk=40GB used_disk=7GB total_vcpus=12 used_vcpus=7 pci_stats=[] {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 18 16:17:04 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:17:04 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:17:04 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Compute_service record updated for user:user {{(pid=70975) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 18 16:17:04 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.381s 
{{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:17:05 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:07 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:08 user nova-compute[70975]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:17:08 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] VM Stopped (Lifecycle Event) Apr 18 16:17:08 user nova-compute[70975]: DEBUG nova.compute.manager [None req-a370fe89-96d4-4d24-b6d8-867058aa5550 None None] [instance: 8e1ccfc5-90a7-443f-83e2-c07be27d6c7c] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:17:10 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Acquiring lock "f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:17:10 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:17:10 user nova-compute[70975]: DEBUG nova.compute.manager [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Starting instance... {{(pid=70975) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 18 16:17:10 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:17:10 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:17:10 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=70975) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 18 16:17:10 user nova-compute[70975]: INFO nova.compute.claims [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Claim successful on node user Apr 18 16:17:10 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:10 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:17:10 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:17:10 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.332s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:17:10 user nova-compute[70975]: DEBUG nova.compute.manager [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Start building networks asynchronously for instance. {{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 18 16:17:10 user nova-compute[70975]: DEBUG nova.compute.manager [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Allocating IP information in the background. 
{{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 18 16:17:10 user nova-compute[70975]: DEBUG nova.network.neutron [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] allocate_for_instance() {{(pid=70975) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 18 16:17:10 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 18 16:17:10 user nova-compute[70975]: DEBUG nova.compute.manager [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Start building block device mappings for instance. {{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 18 16:17:10 user nova-compute[70975]: DEBUG nova.policy [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'af90e17ec027463fa8793e8539c39e13', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6b4e8d8797be4c0e91b1401538f2eba8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70975) authorize /opt/stack/nova/nova/policy.py:203}} Apr 18 16:17:10 user nova-compute[70975]: DEBUG nova.compute.manager [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Start spawning the instance on the hypervisor. 
{{(pid=70975) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 18 16:17:10 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Creating instance directory {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 18 16:17:10 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Creating image(s) Apr 18 16:17:10 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Acquiring lock "/opt/stack/data/nova/instances/f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:17:10 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "/opt/stack/data/nova/instances/f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:17:10 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "/opt/stack/data/nova/instances/f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.017s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:17:10 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:17:11 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.147s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:17:11 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None 
req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Acquiring lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:17:11 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:17:11 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:17:11 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.140s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:17:11 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc/disk 1073741824 {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:17:11 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc/disk 1073741824" returned: 0 in 0.046s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:17:11 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.193s 
{{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:17:11 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:17:11 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.139s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:17:11 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Checking if we can resize image /opt/stack/data/nova/instances/f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc/disk. size=1073741824 {{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 18 16:17:11 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:17:11 user nova-compute[70975]: DEBUG nova.network.neutron [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Successfully created port: bfcdfd2e-b438-4386-bcae-7088ec17c0e6 {{(pid=70975) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 18 16:17:11 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:17:11 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Cannot resize image /opt/stack/data/nova/instances/f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc/disk to a smaller size. 
{{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 18 16:17:11 user nova-compute[70975]: DEBUG nova.objects.instance [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lazy-loading 'migration_context' on Instance uuid f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:17:11 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Created local disks {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 18 16:17:11 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Ensure instance console log exists: /opt/stack/data/nova/instances/f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc/console.log {{(pid=70975) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 18 16:17:11 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:17:11 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:17:11 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG nova.network.neutron [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Successfully updated port: bfcdfd2e-b438-4386-bcae-7088ec17c0e6 {{(pid=70975) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Acquiring lock "refresh_cache-f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 
tempest-AttachVolumeNegativeTest-216357456-project-member] Acquired lock "refresh_cache-f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG nova.network.neutron [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Building network info cache for instance {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG nova.compute.manager [req-e72d73b0-7589-4eb4-96e1-2df741e4ddfb req-1a94bb37-f496-4d35-8661-c45fbabe8839 service nova] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Received event network-changed-bfcdfd2e-b438-4386-bcae-7088ec17c0e6 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG nova.compute.manager [req-e72d73b0-7589-4eb4-96e1-2df741e4ddfb req-1a94bb37-f496-4d35-8661-c45fbabe8839 service nova] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Refreshing instance network info cache due to event network-changed-bfcdfd2e-b438-4386-bcae-7088ec17c0e6. {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-e72d73b0-7589-4eb4-96e1-2df741e4ddfb req-1a94bb37-f496-4d35-8661-c45fbabe8839 service nova] Acquiring lock "refresh_cache-f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG nova.network.neutron [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Instance cache missing network info. 
{{(pid=70975) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG nova.network.neutron [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Updating instance_info_cache with network_info: [{"id": "bfcdfd2e-b438-4386-bcae-7088ec17c0e6", "address": "fa:16:3e:9a:28:d5", "network": {"id": "02aca424-2923-404b-9c66-76bec89f82b7", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1255258227-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6b4e8d8797be4c0e91b1401538f2eba8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfcdfd2e-b4", "ovs_interfaceid": "bfcdfd2e-b438-4386-bcae-7088ec17c0e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Releasing lock "refresh_cache-f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG nova.compute.manager [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Instance network_info: |[{"id": "bfcdfd2e-b438-4386-bcae-7088ec17c0e6", "address": "fa:16:3e:9a:28:d5", "network": {"id": "02aca424-2923-404b-9c66-76bec89f82b7", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1255258227-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6b4e8d8797be4c0e91b1401538f2eba8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfcdfd2e-b4", "ovs_interfaceid": "bfcdfd2e-b438-4386-bcae-7088ec17c0e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils 
[req-e72d73b0-7589-4eb4-96e1-2df741e4ddfb req-1a94bb37-f496-4d35-8661-c45fbabe8839 service nova] Acquired lock "refresh_cache-f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG nova.network.neutron [req-e72d73b0-7589-4eb4-96e1-2df741e4ddfb req-1a94bb37-f496-4d35-8661-c45fbabe8839 service nova] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Refreshing network info cache for port bfcdfd2e-b438-4386-bcae-7088ec17c0e6 {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Start _get_guest_xml network_info=[{"id": "bfcdfd2e-b438-4386-bcae-7088ec17c0e6", "address": "fa:16:3e:9a:28:d5", "network": {"id": "02aca424-2923-404b-9c66-76bec89f82b7", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1255258227-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6b4e8d8797be4c0e91b1401538f2eba8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfcdfd2e-b4", "ovs_interfaceid": "bfcdfd2e-b438-4386-bcae-7088ec17c0e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encrypted': False, 'device_type': 'disk', 'encryption_secret_uuid': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'b11a20de-f82a-4158-b53e-0a0c7a1552cb'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 18 16:17:12 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
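The network_info blob that precedes _get_guest_xml is plain JSON once the surrounding log text is trimmed off, which makes it easy to summarise when reading a log like this one. A short sketch under that assumption; the function name and field selection are ours, and the commented values are the ones visible in the entry above:

    import json

    def summarise_vifs(network_info_json):
        # network_info_json is the JSON list copied out of a
        # "Start _get_guest_xml network_info=[...]" log entry.
        for vif in json.loads(network_info_json):
            subnet = vif["network"]["subnets"][0]
            yield {
                "port_id": vif["id"],                     # bfcdfd2e-b438-4386-bcae-7088ec17c0e6
                "mac": vif["address"],                    # fa:16:3e:9a:28:d5
                "fixed_ip": subnet["ips"][0]["address"],  # 10.0.0.8
                "bridge": vif["network"]["bridge"],       # br-int
                "mtu": vif["network"]["meta"]["mtu"],     # 1442
                "vif_type": vif["type"],                  # ovs
                "active": vif["active"],                  # false in the entry above
            }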
Apr 18 16:17:12 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:17:12 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70975) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-18T16:11:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=), allow threads: True {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Flavor limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Image limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Flavor pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Image pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG nova.virt.hardware 
[None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Got 1 possible topologies {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:17:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-14343453',display_name='tempest-AttachVolumeNegativeTest-server-14343453',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-14343453',id=12,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI2SDBHv75l7hW3tiq5hWHFRDYyei1QQIo9CQRQFQISK8RVXUcgtsJBeI8pkGbxlcETA/pFpFNDAjbdgyUlN3UoIYqsksl/hRT8/J7etZF7prNIypo7A3UV/2lzY82gGhg==',key_name='tempest-keypair-1742241088',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6b4e8d8797be4c0e91b1401538f2eba8',ramdisk_id='',reservation_id='r-rlj93r3i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-216357456',owner_user_name='tempest-AttachVolumeNegativeTest-216357456-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:17:11Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='af90e17ec027463fa8793e8539c39e13',uuid=f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bfcdfd2e-b438-4386-bcae-7088ec17c0e6", "address": "fa:16:3e:9a:28:d5", "network": {"id": "02aca424-2923-404b-9c66-76bec89f82b7", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1255258227-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6b4e8d8797be4c0e91b1401538f2eba8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfcdfd2e-b4", "ovs_interfaceid": "bfcdfd2e-b438-4386-bcae-7088ec17c0e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70975) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Converting VIF {"id": "bfcdfd2e-b438-4386-bcae-7088ec17c0e6", "address": "fa:16:3e:9a:28:d5", "network": {"id": "02aca424-2923-404b-9c66-76bec89f82b7", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1255258227-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "6b4e8d8797be4c0e91b1401538f2eba8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfcdfd2e-b4", "ovs_interfaceid": "bfcdfd2e-b438-4386-bcae-7088ec17c0e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:28:d5,bridge_name='br-int',has_traffic_filtering=True,id=bfcdfd2e-b438-4386-bcae-7088ec17c0e6,network=Network(02aca424-2923-404b-9c66-76bec89f82b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfcdfd2e-b4') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG nova.objects.instance [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lazy-loading 'pci_devices' on Instance uuid f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] End _get_guest_xml xml= Apr 18 16:17:12 user nova-compute[70975]: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc Apr 18 16:17:12 user nova-compute[70975]: instance-0000000c Apr 18 16:17:12 user nova-compute[70975]: 131072 Apr 18 16:17:12 user nova-compute[70975]: 1 Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: tempest-AttachVolumeNegativeTest-server-14343453 Apr 18 16:17:12 user nova-compute[70975]: 2023-04-18 16:17:12 Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: 128 Apr 18 16:17:12 user nova-compute[70975]: 1 Apr 18 16:17:12 user nova-compute[70975]: 0 Apr 18 16:17:12 user nova-compute[70975]: 0 Apr 18 16:17:12 user nova-compute[70975]: 1 Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: tempest-AttachVolumeNegativeTest-216357456-project-member Apr 18 16:17:12 user nova-compute[70975]: tempest-AttachVolumeNegativeTest-216357456 Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: OpenStack Foundation Apr 18 16:17:12 user nova-compute[70975]: OpenStack Nova Apr 18 16:17:12 user nova-compute[70975]: 0.0.0 Apr 18 16:17:12 user 
nova-compute[70975]: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc Apr 18 16:17:12 user nova-compute[70975]: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc Apr 18 16:17:12 user nova-compute[70975]: Virtual Machine Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: hvm Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Nehalem Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: /dev/urandom Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: Apr 18 16:17:12 user nova-compute[70975]: {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:17:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-14343453',display_name='tempest-AttachVolumeNegativeTest-server-14343453',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-14343453',id=12,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI2SDBHv75l7hW3tiq5hWHFRDYyei1QQIo9CQRQFQISK8RVXUcgtsJBeI8pkGbxlcETA/pFpFNDAjbdgyUlN3UoIYqsksl/hRT8/J7etZF7prNIypo7A3UV/2lzY82gGhg==',key_name='tempest-keypair-1742241088',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6b4e8d8797be4c0e91b1401538f2eba8',ramdisk_id='',reservation_id='r-rlj93r3i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-216357456',owner_user_name='tempest-AttachVolumeNegativeTest-216357456-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:17:11Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='af90e17ec027463fa8793e8539c39e13',uuid=f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bfcdfd2e-b438-4386-bcae-7088ec17c0e6", "address": "fa:16:3e:9a:28:d5", "network": {"id": "02aca424-2923-404b-9c66-76bec89f82b7", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1255258227-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6b4e8d8797be4c0e91b1401538f2eba8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfcdfd2e-b4", "ovs_interfaceid": "bfcdfd2e-b438-4386-bcae-7088ec17c0e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Converting VIF {"id": "bfcdfd2e-b438-4386-bcae-7088ec17c0e6", "address": "fa:16:3e:9a:28:d5", "network": {"id": "02aca424-2923-404b-9c66-76bec89f82b7", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1255258227-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": 
false, "tenant_id": "6b4e8d8797be4c0e91b1401538f2eba8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfcdfd2e-b4", "ovs_interfaceid": "bfcdfd2e-b438-4386-bcae-7088ec17c0e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:28:d5,bridge_name='br-int',has_traffic_filtering=True,id=bfcdfd2e-b438-4386-bcae-7088ec17c0e6,network=Network(02aca424-2923-404b-9c66-76bec89f82b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfcdfd2e-b4') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG os_vif [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:28:d5,bridge_name='br-int',has_traffic_filtering=True,id=bfcdfd2e-b438-4386-bcae-7088ec17c0e6,network=Network(02aca424-2923-404b-9c66-76bec89f82b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfcdfd2e-b4') {{(pid=70975) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbfcdfd2e-b4, may_exist=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbfcdfd2e-b4, col_values=(('external_ids', {'iface-id': 'bfcdfd2e-b438-4386-bcae-7088ec17c0e6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9a:28:d5', 'vm-uuid': 'f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc'}),)) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:12 user nova-compute[70975]: INFO os_vif [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:28:d5,bridge_name='br-int',has_traffic_filtering=True,id=bfcdfd2e-b438-4386-bcae-7088ec17c0e6,network=Network(02aca424-2923-404b-9c66-76bec89f82b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfcdfd2e-b4') Apr 18 16:17:12 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] No BDM found with device name vda, not building metadata. {{(pid=70975) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] No VIF found with MAC fa:16:3e:9a:28:d5, not building metadata {{(pid=70975) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG nova.network.neutron [req-e72d73b0-7589-4eb4-96e1-2df741e4ddfb req-1a94bb37-f496-4d35-8661-c45fbabe8839 service nova] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Updated VIF entry in instance network info cache for port bfcdfd2e-b438-4386-bcae-7088ec17c0e6. 
{{(pid=70975) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 18 16:17:12 user nova-compute[70975]: DEBUG nova.network.neutron [req-e72d73b0-7589-4eb4-96e1-2df741e4ddfb req-1a94bb37-f496-4d35-8661-c45fbabe8839 service nova] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Updating instance_info_cache with network_info: [{"id": "bfcdfd2e-b438-4386-bcae-7088ec17c0e6", "address": "fa:16:3e:9a:28:d5", "network": {"id": "02aca424-2923-404b-9c66-76bec89f82b7", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1255258227-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6b4e8d8797be4c0e91b1401538f2eba8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfcdfd2e-b4", "ovs_interfaceid": "bfcdfd2e-b438-4386-bcae-7088ec17c0e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:17:13 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-e72d73b0-7589-4eb4-96e1-2df741e4ddfb req-1a94bb37-f496-4d35-8661-c45fbabe8839 service nova] Releasing lock "refresh_cache-f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:17:13 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:14 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:14 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:14 user nova-compute[70975]: DEBUG nova.compute.manager [req-026951b8-440a-4fda-b247-a7560cffc441 req-b087aef4-8d77-4c56-b65d-211aa5a69212 service nova] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Received event network-vif-plugged-bfcdfd2e-b438-4386-bcae-7088ec17c0e6 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:17:14 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-026951b8-440a-4fda-b247-a7560cffc441 req-b087aef4-8d77-4c56-b65d-211aa5a69212 service nova] Acquiring lock "f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:17:14 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-026951b8-440a-4fda-b247-a7560cffc441 req-b087aef4-8d77-4c56-b65d-211aa5a69212 service nova] Lock "f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:17:14 user 
nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-026951b8-440a-4fda-b247-a7560cffc441 req-b087aef4-8d77-4c56-b65d-211aa5a69212 service nova] Lock "f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:17:14 user nova-compute[70975]: DEBUG nova.compute.manager [req-026951b8-440a-4fda-b247-a7560cffc441 req-b087aef4-8d77-4c56-b65d-211aa5a69212 service nova] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] No waiting events found dispatching network-vif-plugged-bfcdfd2e-b438-4386-bcae-7088ec17c0e6 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:17:14 user nova-compute[70975]: WARNING nova.compute.manager [req-026951b8-440a-4fda-b247-a7560cffc441 req-b087aef4-8d77-4c56-b65d-211aa5a69212 service nova] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Received unexpected event network-vif-plugged-bfcdfd2e-b438-4386-bcae-7088ec17c0e6 for instance with vm_state building and task_state spawning. Apr 18 16:17:14 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:14 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:14 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:14 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:14 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:16 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Resumed> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:17:16 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] VM Resumed (Lifecycle Event) Apr 18 16:17:16 user nova-compute[70975]: DEBUG nova.compute.manager [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Instance event wait completed in 0 seconds for {{(pid=70975) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 18 16:17:16 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Guest created on hypervisor {{(pid=70975) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 18 16:17:16 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Instance spawned successfully. 
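The "Plugging vif VIFOpenVSwitch(...)" and "Successfully plugged vif" entries above come from os-vif; the ovsdbapp AddBridgeCommand/AddPortCommand/DbSetCommand transactions between them are what its ovs plugin runs to add tapbfcdfd2e-b4 to br-int and tag the interface with the port's iface-id, MAC and instance UUID. The sketch below is only a standalone approximation of that call, with object fields copied from this log and everything else left to os-vif defaults; it is not how nova-compute itself builds these objects.

    # Hedged, standalone approximation of the os_vif.plug() call logged above.
    import os_vif
    from os_vif.objects import instance_info, network, vif

    os_vif.initialize()  # loads the linux_bridge/noop/ovs plugins

    net = network.Network(id='02aca424-2923-404b-9c66-76bec89f82b7',
                          bridge='br-int', mtu=1442)
    port = vif.VIFOpenVSwitch(
        id='bfcdfd2e-b438-4386-bcae-7088ec17c0e6',
        address='fa:16:3e:9a:28:d5',
        vif_name='tapbfcdfd2e-b4',
        bridge_name='br-int',
        plugin='ovs',
        has_traffic_filtering=True,
        network=net,
        port_profile=vif.VIFPortProfileOpenVSwitch(
            interface_id='bfcdfd2e-b438-4386-bcae-7088ec17c0e6'))
    inst = instance_info.InstanceInfo(
        uuid='f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc',
        name='instance-0000000c')

    # The ovs plugin translates this into the OVSDB transactions seen above:
    # AddPortCommand on br-int plus DbSetCommand of the external_ids column.
    os_vif.plug(port, inst)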
Apr 18 16:17:16 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 18 16:17:16 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:17:16 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Found default for hw_cdrom_bus of ide {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:17:16 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Found default for hw_disk_bus of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:17:16 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Found default for hw_input_bus of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:17:16 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Found default for hw_pointer_model of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:17:16 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Found default for hw_video_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:17:16 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Found default for hw_vif_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:17:16 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: 
building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:17:16 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] During sync_power_state the instance has a pending task (spawning). Skip. Apr 18 16:17:16 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Started> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:17:16 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] VM Started (Lifecycle Event) Apr 18 16:17:16 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:17:16 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:17:16 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] During sync_power_state the instance has a pending task (spawning). Skip. Apr 18 16:17:16 user nova-compute[70975]: INFO nova.compute.manager [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Took 5.24 seconds to spawn the instance on the hypervisor. 
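The two "Synchronizing instance power state ..." / "During sync_power_state the instance has a pending task (spawning). Skip." pairs above encode a simple rule: while a task is still in flight, the database power_state (0) is expected to lag the hypervisor (1), so the lifecycle handler leaves the instance alone rather than "correcting" it. A rough paraphrase of that check follows; it is an illustration of the rule only, not Nova's actual sync code.

    # Illustration only: the skip rule implied by the log lines above.
    NOSTATE, RUNNING = 0, 1  # power_state values as they appear in the log

    def should_sync(task_state, db_power_state, vm_power_state):
        if task_state is not None:
            # A pending task such as 'spawning' means the DB record is mid-update;
            # syncing now would fight the build, so skip this instance.
            return False
        return db_power_state != vm_power_state

    print(should_sync('spawning', NOSTATE, RUNNING))  # False -> "Skip."
    print(should_sync(None, NOSTATE, RUNNING))        # True  -> state would be repaired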
Apr 18 16:17:16 user nova-compute[70975]: DEBUG nova.compute.manager [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:17:16 user nova-compute[70975]: DEBUG nova.compute.manager [req-332809a1-5c1f-46c9-94b5-0b696a7e6d72 req-89442140-9e89-49f4-9889-2e6cd0fbf183 service nova] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Received event network-vif-plugged-bfcdfd2e-b438-4386-bcae-7088ec17c0e6 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:17:16 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-332809a1-5c1f-46c9-94b5-0b696a7e6d72 req-89442140-9e89-49f4-9889-2e6cd0fbf183 service nova] Acquiring lock "f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:17:16 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-332809a1-5c1f-46c9-94b5-0b696a7e6d72 req-89442140-9e89-49f4-9889-2e6cd0fbf183 service nova] Lock "f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:17:16 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-332809a1-5c1f-46c9-94b5-0b696a7e6d72 req-89442140-9e89-49f4-9889-2e6cd0fbf183 service nova] Lock "f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:17:16 user nova-compute[70975]: DEBUG nova.compute.manager [req-332809a1-5c1f-46c9-94b5-0b696a7e6d72 req-89442140-9e89-49f4-9889-2e6cd0fbf183 service nova] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] No waiting events found dispatching network-vif-plugged-bfcdfd2e-b438-4386-bcae-7088ec17c0e6 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:17:16 user nova-compute[70975]: WARNING nova.compute.manager [req-332809a1-5c1f-46c9-94b5-0b696a7e6d72 req-89442140-9e89-49f4-9889-2e6cd0fbf183 service nova] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Received unexpected event network-vif-plugged-bfcdfd2e-b438-4386-bcae-7088ec17c0e6 for instance with vm_state building and task_state spawning. Apr 18 16:17:16 user nova-compute[70975]: INFO nova.compute.manager [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Took 6.18 seconds to build instance. 
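The "Received event network-vif-plugged-..." entries above are Neutron notifying Nova through the compute API's os-server-external-events endpoint; no waiter happened to be registered for the event at that moment, so nova-compute logs the "Received unexpected event" WARNING instead of dispatching it. A hedged sketch of how such an event is posted is below; the endpoint URL and token are placeholders, and only the server_uuid and tag come from this log.

    # Hedged sketch: delivering a network-vif-plugged event to Nova's
    # os-server-external-events API (this is what Neutron's Nova notifier does).
    import requests

    NOVA_ENDPOINT = 'http://controller:8774/v2.1'  # placeholder
    TOKEN = '<keystone token>'                     # placeholder

    event = {
        'name': 'network-vif-plugged',
        'server_uuid': 'f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc',  # instance in the log
        'tag': 'bfcdfd2e-b438-4386-bcae-7088ec17c0e6',          # the plugged port
        'status': 'completed',
    }
    resp = requests.post(f'{NOVA_ENDPOINT}/os-server-external-events',
                         json={'events': [event]},
                         headers={'X-Auth-Token': TOKEN})
    resp.raise_for_status()
    # nova-api routes the event to the compute host owning the instance; the compute
    # manager then pops any registered waiter, or logs the WARNING seen above when
    # nothing is waiting for that event.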
Apr 18 16:17:16 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-15398257-0711-4956-963c-876a15c8f6d3 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.272s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:17:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:22 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:22 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:22 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:25 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:27 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:27 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:31 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:32 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:32 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:34 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:37 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:37 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:38 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:41 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:42 user nova-compute[70975]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:42 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:46 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Acquiring lock "b71bd3c1-da58-4cb0-abc3-650e11b9d4ce" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:17:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Lock "b71bd3c1-da58-4cb0-abc3-650e11b9d4ce" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:17:47 user nova-compute[70975]: DEBUG nova.compute.manager [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Starting instance... {{(pid=70975) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 18 16:17:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:17:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:17:47 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=70975) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 18 16:17:47 user nova-compute[70975]: INFO nova.compute.claims [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Claim successful on node user Apr 18 16:17:47 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:47 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:47 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:17:47 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:17:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.395s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:17:47 user nova-compute[70975]: DEBUG nova.compute.manager [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Start building networks asynchronously for instance. {{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 18 16:17:47 user nova-compute[70975]: DEBUG nova.compute.manager [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Allocating IP information in the background. 
{{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 18 16:17:47 user nova-compute[70975]: DEBUG nova.network.neutron [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] allocate_for_instance() {{(pid=70975) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 18 16:17:47 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 18 16:17:47 user nova-compute[70975]: DEBUG nova.compute.manager [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Start building block device mappings for instance. {{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 18 16:17:47 user nova-compute[70975]: DEBUG nova.policy [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fd46686fd5b845cca0f3d9452a86f4ca', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd82a93c1cb9b4a4da7114874ddf0aa27', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70975) authorize /opt/stack/nova/nova/policy.py:203}} Apr 18 16:17:48 user nova-compute[70975]: DEBUG nova.compute.manager [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Start spawning the instance on the hypervisor. 
{{(pid=70975) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 18 16:17:48 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Creating instance directory {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 18 16:17:48 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Creating image(s) Apr 18 16:17:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Acquiring lock "/opt/stack/data/nova/instances/b71bd3c1-da58-4cb0-abc3-650e11b9d4ce/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:17:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Lock "/opt/stack/data/nova/instances/b71bd3c1-da58-4cb0-abc3-650e11b9d4ce/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:17:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Lock "/opt/stack/data/nova/instances/b71bd3c1-da58-4cb0-abc3-650e11b9d4ce/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:17:48 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:17:48 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.152s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:17:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 
tempest-AttachVolumeTestJSON-313351389-project-member] Acquiring lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:17:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:17:48 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:17:48 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.128s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:17:48 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/b71bd3c1-da58-4cb0-abc3-650e11b9d4ce/disk 1073741824 {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:17:48 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/b71bd3c1-da58-4cb0-abc3-650e11b9d4ce/disk 1073741824" returned: 0 in 0.046s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:17:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.180s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:17:48 user nova-compute[70975]: DEBUG 
oslo_concurrency.processutils [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:17:48 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.133s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:17:48 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Checking if we can resize image /opt/stack/data/nova/instances/b71bd3c1-da58-4cb0-abc3-650e11b9d4ce/disk. size=1073741824 {{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 18 16:17:48 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b71bd3c1-da58-4cb0-abc3-650e11b9d4ce/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:17:48 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b71bd3c1-da58-4cb0-abc3-650e11b9d4ce/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:17:48 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Cannot resize image /opt/stack/data/nova/instances/b71bd3c1-da58-4cb0-abc3-650e11b9d4ce/disk to a smaller size. 
{{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 18 16:17:48 user nova-compute[70975]: DEBUG nova.objects.instance [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Lazy-loading 'migration_context' on Instance uuid b71bd3c1-da58-4cb0-abc3-650e11b9d4ce {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:17:48 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Created local disks {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 18 16:17:48 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Ensure instance console log exists: /opt/stack/data/nova/instances/b71bd3c1-da58-4cb0-abc3-650e11b9d4ce/console.log {{(pid=70975) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 18 16:17:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:17:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:17:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:17:48 user nova-compute[70975]: DEBUG nova.network.neutron [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Successfully created port: 4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010 {{(pid=70975) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 18 16:17:49 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:49 user nova-compute[70975]: DEBUG nova.network.neutron [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Successfully updated port: 4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010 {{(pid=70975) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 18 16:17:49 user 
nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Acquiring lock "refresh_cache-b71bd3c1-da58-4cb0-abc3-650e11b9d4ce" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:17:49 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Acquired lock "refresh_cache-b71bd3c1-da58-4cb0-abc3-650e11b9d4ce" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:17:49 user nova-compute[70975]: DEBUG nova.network.neutron [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Building network info cache for instance {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 18 16:17:49 user nova-compute[70975]: DEBUG nova.compute.manager [req-1efc91ea-fab2-4adf-8089-18be417eceb3 req-311abd58-6776-4f27-a9d4-21b0ef357abf service nova] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Received event network-changed-4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:17:49 user nova-compute[70975]: DEBUG nova.compute.manager [req-1efc91ea-fab2-4adf-8089-18be417eceb3 req-311abd58-6776-4f27-a9d4-21b0ef357abf service nova] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Refreshing instance network info cache due to event network-changed-4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010. {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 18 16:17:49 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-1efc91ea-fab2-4adf-8089-18be417eceb3 req-311abd58-6776-4f27-a9d4-21b0ef357abf service nova] Acquiring lock "refresh_cache-b71bd3c1-da58-4cb0-abc3-650e11b9d4ce" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:17:49 user nova-compute[70975]: DEBUG nova.network.neutron [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Instance cache missing network info. 
{{(pid=70975) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 18 16:17:50 user nova-compute[70975]: DEBUG nova.network.neutron [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Updating instance_info_cache with network_info: [{"id": "4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010", "address": "fa:16:3e:a3:29:06", "network": {"id": "7f49a051-667b-4e91-80de-f4bbf2d6f09e", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-316224389-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d82a93c1cb9b4a4da7114874ddf0aa27", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4aa3a6dd-3c", "ovs_interfaceid": "4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:17:50 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Releasing lock "refresh_cache-b71bd3c1-da58-4cb0-abc3-650e11b9d4ce" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:17:50 user nova-compute[70975]: DEBUG nova.compute.manager [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Instance network_info: |[{"id": "4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010", "address": "fa:16:3e:a3:29:06", "network": {"id": "7f49a051-667b-4e91-80de-f4bbf2d6f09e", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-316224389-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d82a93c1cb9b4a4da7114874ddf0aa27", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4aa3a6dd-3c", "ovs_interfaceid": "4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 18 16:17:50 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-1efc91ea-fab2-4adf-8089-18be417eceb3 req-311abd58-6776-4f27-a9d4-21b0ef357abf service nova] Acquired lock "refresh_cache-b71bd3c1-da58-4cb0-abc3-650e11b9d4ce" {{(pid=70975) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:17:50 user nova-compute[70975]: DEBUG nova.network.neutron [req-1efc91ea-fab2-4adf-8089-18be417eceb3 req-311abd58-6776-4f27-a9d4-21b0ef357abf service nova] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Refreshing network info cache for port 4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010 {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 18 16:17:50 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Start _get_guest_xml network_info=[{"id": "4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010", "address": "fa:16:3e:a3:29:06", "network": {"id": "7f49a051-667b-4e91-80de-f4bbf2d6f09e", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-316224389-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d82a93c1cb9b4a4da7114874ddf0aa27", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4aa3a6dd-3c", "ovs_interfaceid": "4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encrypted': False, 'device_type': 'disk', 'encryption_secret_uuid': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'b11a20de-f82a-4158-b53e-0a0c7a1552cb'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 18 16:17:50 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:17:50 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
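The qemu-img probes earlier in this build (run through oslo_concurrency.prlimit with a 1 GiB address-space cap and a 30 s CPU cap, exactly as shown in the logged command lines) can be reproduced outside Nova to inspect the same disks. A minimal sketch, assuming qemu-img and oslo.concurrency are installed and reusing the instance disk path from the log:

# Reproduces, outside Nova, the qemu-img probe logged above. The resource caps
# (--as / --cpu) mirror the oslo_concurrency.prlimit wrapper visible in the
# command line; the path is taken from this log and will differ elsewhere.
import json
import subprocess

def qemu_img_info(path, mem_limit=1073741824, cpu_limit=30):
    cmd = [
        "/usr/bin/python3.10", "-m", "oslo_concurrency.prlimit",
        "--as=%d" % mem_limit, "--cpu=%d" % cpu_limit, "--",
        "env", "LC_ALL=C", "LANG=C",
        # --force-share lets us query an image that another process may
        # already have open for writing.
        "qemu-img", "info", path, "--force-share", "--output=json",
    ]
    return json.loads(subprocess.check_output(cmd))

if __name__ == "__main__":
    info = qemu_img_info(
        "/opt/stack/data/nova/instances/b71bd3c1-da58-4cb0-abc3-650e11b9d4ce/disk")
    # Nova's can_resize_image() compares this virtual size with the flavor's
    # root disk; shrinking is refused, hence the "Cannot resize image ... to a
    # smaller size" message above.
    print(info["format"], info["virtual-size"])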
Apr 18 16:17:50 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70975) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 18 16:17:50 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-18T16:11:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=), allow threads: True {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 18 16:17:50 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Flavor limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 18 16:17:50 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Image limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 18 16:17:50 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Flavor pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 18 16:17:50 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Image pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 18 16:17:50 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 18 16:17:50 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 18 16:17:50 user 
nova-compute[70975]: DEBUG nova.virt.hardware [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 18 16:17:50 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Got 1 possible topologies {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 18 16:17:50 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 18 16:17:50 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 18 16:17:50 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:17:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-2075963637',display_name='tempest-AttachVolumeTestJSON-server-2075963637',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-2075963637',id=13,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAkq9Vg7VDQCpKpiGFoZfkEz1qZcQquI3n1H/unrAhcJuN8Zdg6SoPHia4dOkiKjV573Nr9cV3ZtHK+a5VfiLfEY5Cki6rbV4aTWzAjQWI/N4FbFpvBWX1A+Usn/9nq2QA==',key_name='tempest-keypair-1743850703',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d82a93c1cb9b4a4da7114874ddf0aa27',ramdisk_id='',reservation_id='r-00coyh3s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-313351389',owner_user_name='tempest-AttachVolumeTestJSON-313351389-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:17:48Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fd46686fd5b845cca0f3d9452a86f4ca',uuid=b71bd3c1-da58-4cb0-abc3-650e11b9d4ce,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010", "address": "fa:16:3e:a3:29:06", "network": {"id": "7f49a051-667b-4e91-80de-f4bbf2d6f09e", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-316224389-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d82a93c1cb9b4a4da7114874ddf0aa27", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4aa3a6dd-3c", "ovs_interfaceid": "4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70975) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 18 16:17:50 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Converting VIF {"id": "4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010", "address": "fa:16:3e:a3:29:06", "network": {"id": "7f49a051-667b-4e91-80de-f4bbf2d6f09e", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-316224389-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, 
"tenant_id": "d82a93c1cb9b4a4da7114874ddf0aa27", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4aa3a6dd-3c", "ovs_interfaceid": "4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:17:50 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a3:29:06,bridge_name='br-int',has_traffic_filtering=True,id=4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010,network=Network(7f49a051-667b-4e91-80de-f4bbf2d6f09e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4aa3a6dd-3c') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:17:50 user nova-compute[70975]: DEBUG nova.objects.instance [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Lazy-loading 'pci_devices' on Instance uuid b71bd3c1-da58-4cb0-abc3-650e11b9d4ce {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:17:50 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] End _get_guest_xml xml= Apr 18 16:17:50 user nova-compute[70975]: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce Apr 18 16:17:50 user nova-compute[70975]: instance-0000000d Apr 18 16:17:50 user nova-compute[70975]: 131072 Apr 18 16:17:50 user nova-compute[70975]: 1 Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: tempest-AttachVolumeTestJSON-server-2075963637 Apr 18 16:17:50 user nova-compute[70975]: 2023-04-18 16:17:50 Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: 128 Apr 18 16:17:50 user nova-compute[70975]: 1 Apr 18 16:17:50 user nova-compute[70975]: 0 Apr 18 16:17:50 user nova-compute[70975]: 0 Apr 18 16:17:50 user nova-compute[70975]: 1 Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: tempest-AttachVolumeTestJSON-313351389-project-member Apr 18 16:17:50 user nova-compute[70975]: tempest-AttachVolumeTestJSON-313351389 Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: OpenStack Foundation Apr 18 16:17:50 user nova-compute[70975]: OpenStack Nova Apr 18 16:17:50 user nova-compute[70975]: 0.0.0 Apr 18 16:17:50 user nova-compute[70975]: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce 
Apr 18 16:17:50 user nova-compute[70975]: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce Apr 18 16:17:50 user nova-compute[70975]: Virtual Machine Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: hvm Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Nehalem Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: /dev/urandom Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: Apr 18 16:17:50 user nova-compute[70975]: {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 18 16:17:50 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:17:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-2075963637',display_name='tempest-AttachVolumeTestJSON-server-2075963637',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-2075963637',id=13,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAkq9Vg7VDQCpKpiGFoZfkEz1qZcQquI3n1H/unrAhcJuN8Zdg6SoPHia4dOkiKjV573Nr9cV3ZtHK+a5VfiLfEY5Cki6rbV4aTWzAjQWI/N4FbFpvBWX1A+Usn/9nq2QA==',key_name='tempest-keypair-1743850703',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d82a93c1cb9b4a4da7114874ddf0aa27',ramdisk_id='',reservation_id='r-00coyh3s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-313351389',owner_user_name='tempest-AttachVolumeTestJSON-313351389-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:17:48Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fd46686fd5b845cca0f3d9452a86f4ca',uuid=b71bd3c1-da58-4cb0-abc3-650e11b9d4ce,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010", "address": "fa:16:3e:a3:29:06", "network": {"id": "7f49a051-667b-4e91-80de-f4bbf2d6f09e", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-316224389-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d82a93c1cb9b4a4da7114874ddf0aa27", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4aa3a6dd-3c", "ovs_interfaceid": "4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 18 16:17:50 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Converting VIF {"id": "4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010", "address": "fa:16:3e:a3:29:06", "network": {"id": "7f49a051-667b-4e91-80de-f4bbf2d6f09e", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-316224389-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": 
"d82a93c1cb9b4a4da7114874ddf0aa27", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4aa3a6dd-3c", "ovs_interfaceid": "4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:17:50 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a3:29:06,bridge_name='br-int',has_traffic_filtering=True,id=4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010,network=Network(7f49a051-667b-4e91-80de-f4bbf2d6f09e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4aa3a6dd-3c') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:17:50 user nova-compute[70975]: DEBUG os_vif [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a3:29:06,bridge_name='br-int',has_traffic_filtering=True,id=4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010,network=Network(7f49a051-667b-4e91-80de-f4bbf2d6f09e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4aa3a6dd-3c') {{(pid=70975) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 18 16:17:50 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:50 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:17:50 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 18 16:17:50 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:50 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4aa3a6dd-3c, may_exist=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:17:50 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4aa3a6dd-3c, col_values=(('external_ids', {'iface-id': '4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a3:29:06', 'vm-uuid': 'b71bd3c1-da58-4cb0-abc3-650e11b9d4ce'}),)) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:17:50 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 
{{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:50 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:17:50 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:50 user nova-compute[70975]: INFO os_vif [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a3:29:06,bridge_name='br-int',has_traffic_filtering=True,id=4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010,network=Network(7f49a051-667b-4e91-80de-f4bbf2d6f09e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4aa3a6dd-3c') Apr 18 16:17:50 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] No BDM found with device name vda, not building metadata. {{(pid=70975) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 18 16:17:50 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] No VIF found with MAC fa:16:3e:a3:29:06, not building metadata {{(pid=70975) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 18 16:17:50 user nova-compute[70975]: DEBUG nova.network.neutron [req-1efc91ea-fab2-4adf-8089-18be417eceb3 req-311abd58-6776-4f27-a9d4-21b0ef357abf service nova] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Updated VIF entry in instance network info cache for port 4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010. 
{{(pid=70975) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 18 16:17:50 user nova-compute[70975]: DEBUG nova.network.neutron [req-1efc91ea-fab2-4adf-8089-18be417eceb3 req-311abd58-6776-4f27-a9d4-21b0ef357abf service nova] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Updating instance_info_cache with network_info: [{"id": "4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010", "address": "fa:16:3e:a3:29:06", "network": {"id": "7f49a051-667b-4e91-80de-f4bbf2d6f09e", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-316224389-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d82a93c1cb9b4a4da7114874ddf0aa27", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4aa3a6dd-3c", "ovs_interfaceid": "4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:17:50 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-1efc91ea-fab2-4adf-8089-18be417eceb3 req-311abd58-6776-4f27-a9d4-21b0ef357abf service nova] Releasing lock "refresh_cache-b71bd3c1-da58-4cb0-abc3-650e11b9d4ce" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:17:51 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:51 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:51 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:51 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:51 user nova-compute[70975]: DEBUG nova.compute.manager [req-c698486d-2dbc-4853-81d4-682c661d6d49 req-1c5f45fa-f67c-4b7c-8a76-492677287c99 service nova] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Received event network-vif-plugged-4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:17:51 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-c698486d-2dbc-4853-81d4-682c661d6d49 req-1c5f45fa-f67c-4b7c-8a76-492677287c99 service nova] Acquiring lock "b71bd3c1-da58-4cb0-abc3-650e11b9d4ce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:17:51 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-c698486d-2dbc-4853-81d4-682c661d6d49 req-1c5f45fa-f67c-4b7c-8a76-492677287c99 service nova] Lock "b71bd3c1-da58-4cb0-abc3-650e11b9d4ce-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:17:51 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-c698486d-2dbc-4853-81d4-682c661d6d49 req-1c5f45fa-f67c-4b7c-8a76-492677287c99 service nova] Lock "b71bd3c1-da58-4cb0-abc3-650e11b9d4ce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:17:51 user nova-compute[70975]: DEBUG nova.compute.manager [req-c698486d-2dbc-4853-81d4-682c661d6d49 req-1c5f45fa-f67c-4b7c-8a76-492677287c99 service nova] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] No waiting events found dispatching network-vif-plugged-4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:17:51 user nova-compute[70975]: WARNING nova.compute.manager [req-c698486d-2dbc-4853-81d4-682c661d6d49 req-1c5f45fa-f67c-4b7c-8a76-492677287c99 service nova] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Received unexpected event network-vif-plugged-4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010 for instance with vm_state building and task_state spawning. Apr 18 16:17:52 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:52 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:52 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:52 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:52 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:53 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Resumed> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:17:53 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] VM Resumed (Lifecycle Event) Apr 18 16:17:53 user nova-compute[70975]: DEBUG nova.compute.manager [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Instance event wait completed in 0 seconds for {{(pid=70975) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 18 16:17:53 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Guest created on hypervisor {{(pid=70975) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 18 16:17:53 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: 
b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Instance spawned successfully. Apr 18 16:17:53 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 18 16:17:53 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:17:53 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Found default for hw_cdrom_bus of ide {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:17:53 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Found default for hw_disk_bus of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:17:53 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Found default for hw_input_bus of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:17:53 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Found default for hw_pointer_model of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:17:53 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Found default for hw_video_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:17:53 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Found default for hw_vif_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:17:53 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Synchronizing instance power state after lifecycle event "Resumed"; current 
vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:17:53 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] During sync_power_state the instance has a pending task (spawning). Skip. Apr 18 16:17:53 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Started> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:17:53 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] VM Started (Lifecycle Event) Apr 18 16:17:53 user nova-compute[70975]: DEBUG nova.compute.manager [req-4beb1438-51b4-4dc2-8861-2fd593acc0e1 req-dc4fd8b3-a526-4c02-bc8c-01ed9491fb42 service nova] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Received event network-vif-plugged-4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:17:53 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-4beb1438-51b4-4dc2-8861-2fd593acc0e1 req-dc4fd8b3-a526-4c02-bc8c-01ed9491fb42 service nova] Acquiring lock "b71bd3c1-da58-4cb0-abc3-650e11b9d4ce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:17:53 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-4beb1438-51b4-4dc2-8861-2fd593acc0e1 req-dc4fd8b3-a526-4c02-bc8c-01ed9491fb42 service nova] Lock "b71bd3c1-da58-4cb0-abc3-650e11b9d4ce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:17:53 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-4beb1438-51b4-4dc2-8861-2fd593acc0e1 req-dc4fd8b3-a526-4c02-bc8c-01ed9491fb42 service nova] Lock "b71bd3c1-da58-4cb0-abc3-650e11b9d4ce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:17:53 user nova-compute[70975]: DEBUG nova.compute.manager [req-4beb1438-51b4-4dc2-8861-2fd593acc0e1 req-dc4fd8b3-a526-4c02-bc8c-01ed9491fb42 service nova] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] No waiting events found dispatching network-vif-plugged-4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:17:53 user nova-compute[70975]: WARNING nova.compute.manager [req-4beb1438-51b4-4dc2-8861-2fd593acc0e1 req-dc4fd8b3-a526-4c02-bc8c-01ed9491fb42 service nova] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Received unexpected event network-vif-plugged-4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010 for instance with vm_state building and task_state spawning. 
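The "Synchronizing instance power state" entries above compare the database power_state (0) with what libvirt reports for the freshly started guest (1). A small lookup table makes those integers readable; the names below mirror nova.compute.power_state and are an assumption to verify against the deployed source tree:

# Human-readable names for the power-state integers seen in the
# "Synchronizing instance power state" lines: 0 is the DB value while the
# instance is still spawning, 1 is what the hypervisor reports once running.
POWER_STATE = {
    0x00: "NOSTATE",
    0x01: "RUNNING",
    0x03: "PAUSED",
    0x04: "SHUTDOWN",
    0x06: "CRASHED",
    0x07: "SUSPENDED",
}

print(POWER_STATE[0], "->", POWER_STATE[1])  # NOSTATE -> RUNNING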
Apr 18 16:17:53 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:17:54 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:17:54 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] During sync_power_state the instance has a pending task (spawning). Skip. Apr 18 16:17:54 user nova-compute[70975]: INFO nova.compute.manager [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Took 6.04 seconds to spawn the instance on the hypervisor. Apr 18 16:17:54 user nova-compute[70975]: DEBUG nova.compute.manager [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:17:54 user nova-compute[70975]: INFO nova.compute.manager [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Took 6.76 seconds to build instance. 
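Messages such as "Took 6.04 seconds to spawn the instance on the hypervisor" and "Took 6.76 seconds to build instance" are convenient hooks for measuring boot latency across a tempest run. A minimal sketch that pulls those timings out of a saved journal excerpt (the file name is illustrative):

import re

# Matches the compute manager's timing messages, e.g.
#   "[instance: <uuid>] Took 6.76 seconds to build instance."
#   "[instance: <uuid>] Took 6.04 seconds to spawn the instance on the hypervisor."
PATTERN = re.compile(
    r"\[instance: (?P<uuid>[0-9a-f-]{36})\] Took (?P<secs>[\d.]+) seconds to "
    r"(?P<phase>spawn the instance on the hypervisor|build instance)")

def timings(path):
    results = []
    with open(path, errors="replace") as fh:
        for line in fh:
            m = PATTERN.search(line)
            if m:
                results.append(
                    (m.group("uuid"), m.group("phase"), float(m.group("secs"))))
    return results

if __name__ == "__main__":
    for uuid, phase, secs in timings("nova-compute.log"):
        print(f"{uuid} {phase}: {secs:.2f}s")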
Apr 18 16:17:54 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-36bb0272-3ce1-4778-b58e-d527aa468220 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Lock "b71bd3c1-da58-4cb0-abc3-650e11b9d4ce" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.855s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:17:55 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:55 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:57 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:58 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:17:59 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:17:59 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:17:59 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:17:59 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:17:59 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=70975) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 18 16:17:59 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Acquiring lock "6528f05a-9f05-4f35-b991-687e4f47029e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:17:59 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "6528f05a-9f05-4f35-b991-687e4f47029e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:17:59 user nova-compute[70975]: DEBUG nova.compute.manager [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Starting instance... {{(pid=70975) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 18 16:18:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:18:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:18:00 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=70975) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 18 16:18:00 user nova-compute[70975]: INFO nova.compute.claims [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Claim successful on node user Apr 18 16:18:00 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:00 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:18:00 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:18:00 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:18:00 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:18:00 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager.update_available_resource {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:18:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.391s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:18:00 user nova-compute[70975]: DEBUG nova.compute.manager [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Start building networks asynchronously for instance. 
{{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 18 16:18:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:18:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:18:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:18:00 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Auditing locally available compute resources for user (node: user) {{(pid=70975) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 18 16:18:00 user nova-compute[70975]: DEBUG nova.compute.manager [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Allocating IP information in the background. {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 18 16:18:00 user nova-compute[70975]: DEBUG nova.network.neutron [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] allocate_for_instance() {{(pid=70975) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 18 16:18:00 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 18 16:18:00 user nova-compute[70975]: DEBUG nova.compute.manager [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Start building block device mappings for instance. 
{{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 18 16:18:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e32196da-a530-4422-8566-5edb01f3cc62/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:18:00 user nova-compute[70975]: DEBUG nova.policy [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c54c277689214bd0a2cadb1e2ac288a9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f516f5ec45ca4508841c77f79e8c038b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70975) authorize /opt/stack/nova/nova/policy.py:203}} Apr 18 16:18:00 user nova-compute[70975]: DEBUG nova.compute.manager [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Start spawning the instance on the hypervisor. {{(pid=70975) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 18 16:18:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Creating instance directory {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 18 16:18:00 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Creating image(s) Apr 18 16:18:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Acquiring lock "/opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:18:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "/opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=70975) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:18:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "/opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:18:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:18:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e32196da-a530-4422-8566-5edb01f3cc62/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:18:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e32196da-a530-4422-8566-5edb01f3cc62/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:18:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.129s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:18:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Acquiring lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:18:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" acquired by 
"nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:18:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:18:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e32196da-a530-4422-8566-5edb01f3cc62/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:18:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:18:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.131s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:18:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk 1073741824 {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:18:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk 1073741824" returned: 0 in 0.045s {{(pid=70975) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:18:01 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.183s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:18:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:18:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:18:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:18:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.145s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:18:01 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Checking if we can resize image /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk. 
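The qemu-img create call logged just above is the Qcow2 image backend building the instance root disk: a copy-on-write overlay whose backing file is the cached raw base image under _base, sized to the flavor's 1 GiB root disk. A hedged, stand-alone sketch of the same operation (the paths in the usage comment are hypothetical placeholders):

    import os
    import subprocess

    def create_qcow2_overlay(base_image: str, disk_path: str, size_bytes: int) -> None:
        # qcow2 overlay on a raw backing file, as in the logged command.
        subprocess.run(
            ['qemu-img', 'create', '-f', 'qcow2',
             '-o', f'backing_file={base_image},backing_fmt=raw',
             disk_path, str(size_bytes)],
            env=dict(os.environ, LC_ALL='C', LANG='C'),  # mirror the env prefix in the log
            check=True,
        )

    # e.g. a 1 GiB root disk, matching the size in the command above:
    # create_qcow2_overlay('/opt/stack/data/nova/instances/_base/<base-image>',
    #                      '/opt/stack/data/nova/instances/<instance-uuid>/disk',
    #                      1 * 1024 ** 3)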
size=1073741824 {{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 18 16:18:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:18:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc/disk --force-share --output=json" returned: 0 in 0.158s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:18:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b71bd3c1-da58-4cb0-abc3-650e11b9d4ce/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:18:01 user nova-compute[70975]: DEBUG nova.network.neutron [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Successfully created port: 08164ae1-ace4-4d80-ad79-1741eacfa16e {{(pid=70975) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 18 16:18:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk --force-share --output=json" returned: 0 in 0.156s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:18:01 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Cannot resize image /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk to a smaller size. 
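The "Checking if we can resize image ... size=1073741824" / "Cannot resize image ... to a smaller size" pair above is the grow-only resize guard: the requested size (the 1 GiB flavor root disk) is compared against the image's current virtual size as reported by qemu-img info. A rough sketch of that check under those assumptions, not nova's actual implementation:

    import json
    import subprocess

    def can_resize_image(path: str, requested_bytes: int) -> bool:
        # Read the current virtual size from qemu-img's JSON output.
        out = subprocess.run(
            ['qemu-img', 'info', '--force-share', '--output=json', path],
            capture_output=True, text=True, check=True,
        ).stdout
        virtual_size = json.loads(out)['virtual-size']
        # Only strict growth counts as a resize; an equal or smaller request
        # is rejected, which is what produces the message logged above.
        return requested_bytes > virtual_size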
{{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 18 16:18:01 user nova-compute[70975]: DEBUG nova.objects.instance [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lazy-loading 'migration_context' on Instance uuid 6528f05a-9f05-4f35-b991-687e4f47029e {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:18:01 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Created local disks {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 18 16:18:01 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Ensure instance console log exists: /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/console.log {{(pid=70975) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 18 16:18:01 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:18:01 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:18:01 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:18:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b71bd3c1-da58-4cb0-abc3-650e11b9d4ce/disk --force-share --output=json" returned: 0 in 0.164s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:18:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/b71bd3c1-da58-4cb0-abc3-650e11b9d4ce/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:18:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b71bd3c1-da58-4cb0-abc3-650e11b9d4ce/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:18:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d7a293bf-a9bd-424e-ba11-bbed7dfea41c/disk.rescue --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:18:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d7a293bf-a9bd-424e-ba11-bbed7dfea41c/disk.rescue --force-share --output=json" returned: 0 in 0.139s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:18:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d7a293bf-a9bd-424e-ba11-bbed7dfea41c/disk.rescue --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:18:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d7a293bf-a9bd-424e-ba11-bbed7dfea41c/disk.rescue --force-share --output=json" returned: 0 in 0.147s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:18:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d7a293bf-a9bd-424e-ba11-bbed7dfea41c/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:18:01 user nova-compute[70975]: DEBUG nova.network.neutron [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Successfully updated port: 08164ae1-ace4-4d80-ad79-1741eacfa16e {{(pid=70975) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 18 16:18:01 user nova-compute[70975]: 
DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d7a293bf-a9bd-424e-ba11-bbed7dfea41c/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:18:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d7a293bf-a9bd-424e-ba11-bbed7dfea41c/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Acquiring lock "refresh_cache-6528f05a-9f05-4f35-b991-687e4f47029e" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Acquired lock "refresh_cache-6528f05a-9f05-4f35-b991-687e4f47029e" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG nova.network.neutron [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Building network info cache for instance {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG nova.compute.manager [req-8da7c696-7ccd-4c38-9674-dba57e11e644 req-2a9894fb-8445-4efa-82b0-304000383258 service nova] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Received event network-changed-08164ae1-ace4-4d80-ad79-1741eacfa16e {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG nova.compute.manager [req-8da7c696-7ccd-4c38-9674-dba57e11e644 req-2a9894fb-8445-4efa-82b0-304000383258 service nova] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Refreshing instance network info cache due to event network-changed-08164ae1-ace4-4d80-ad79-1741eacfa16e. 
{{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-8da7c696-7ccd-4c38-9674-dba57e11e644 req-2a9894fb-8445-4efa-82b0-304000383258 service nova] Acquiring lock "refresh_cache-6528f05a-9f05-4f35-b991-687e4f47029e" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG nova.network.neutron [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Instance cache missing network info. {{(pid=70975) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d7a293bf-a9bd-424e-ba11-bbed7dfea41c/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/da82d905-1ca1-403d-9598-7561e69b9704/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/da82d905-1ca1-403d-9598-7561e69b9704/disk --force-share --output=json" returned: 0 in 0.126s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/da82d905-1ca1-403d-9598-7561e69b9704/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG nova.network.neutron [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Updating instance_info_cache with network_info: [{"id": "08164ae1-ace4-4d80-ad79-1741eacfa16e", "address": "fa:16:3e:28:00:5b", "network": {"id": "923d10dc-c67e-4426-9c6e-856e903e2446", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-131274818-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], 
"version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f516f5ec45ca4508841c77f79e8c038b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap08164ae1-ac", "ovs_interfaceid": "08164ae1-ace4-4d80-ad79-1741eacfa16e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/da82d905-1ca1-403d-9598-7561e69b9704/disk --force-share --output=json" returned: 0 in 0.149s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8aaa4e97-9439-4760-9e05-8b248b02074f/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Releasing lock "refresh_cache-6528f05a-9f05-4f35-b991-687e4f47029e" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG nova.compute.manager [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Instance network_info: |[{"id": "08164ae1-ace4-4d80-ad79-1741eacfa16e", "address": "fa:16:3e:28:00:5b", "network": {"id": "923d10dc-c67e-4426-9c6e-856e903e2446", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-131274818-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f516f5ec45ca4508841c77f79e8c038b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap08164ae1-ac", "ovs_interfaceid": "08164ae1-ace4-4d80-ad79-1741eacfa16e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70975) 
_allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-8da7c696-7ccd-4c38-9674-dba57e11e644 req-2a9894fb-8445-4efa-82b0-304000383258 service nova] Acquired lock "refresh_cache-6528f05a-9f05-4f35-b991-687e4f47029e" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG nova.network.neutron [req-8da7c696-7ccd-4c38-9674-dba57e11e644 req-2a9894fb-8445-4efa-82b0-304000383258 service nova] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Refreshing network info cache for port 08164ae1-ace4-4d80-ad79-1741eacfa16e {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Start _get_guest_xml network_info=[{"id": "08164ae1-ace4-4d80-ad79-1741eacfa16e", "address": "fa:16:3e:28:00:5b", "network": {"id": "923d10dc-c67e-4426-9c6e-856e903e2446", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-131274818-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f516f5ec45ca4508841c77f79e8c038b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap08164ae1-ac", "ovs_interfaceid": "08164ae1-ace4-4d80-ad79-1741eacfa16e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encrypted': False, 'device_type': 'disk', 'encryption_secret_uuid': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'b11a20de-f82a-4158-b53e-0a0c7a1552cb'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 18 16:18:02 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] This host appears to have multiple sockets per 
NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:18:02 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:18:02 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70975) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-18T16:11:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=), allow threads: True {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Flavor limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Image limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Flavor pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Image pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 
tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Got 1 possible topologies {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:17:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1865674245',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1865674245',id=14,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f516f5ec45ca4508841c77f79e8c038b',ramdisk_id='',reservation_id='r-ad3dbxxv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-2021464272',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:18:01Z,user_data=None,user_id='c54c277689214bd0a2cadb1e2ac288a9',uuid=6528f05a-9f05-4f35-b991-687e4f47029e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "08164ae1-ace4-4d80-ad79-1741eacfa16e", "address": "fa:16:3e:28:00:5b", "network": {"id": "923d10dc-c67e-4426-9c6e-856e903e2446", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-131274818-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f516f5ec45ca4508841c77f79e8c038b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap08164ae1-ac", "ovs_interfaceid": "08164ae1-ace4-4d80-ad79-1741eacfa16e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70975) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Converting VIF {"id": "08164ae1-ace4-4d80-ad79-1741eacfa16e", "address": 
"fa:16:3e:28:00:5b", "network": {"id": "923d10dc-c67e-4426-9c6e-856e903e2446", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-131274818-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f516f5ec45ca4508841c77f79e8c038b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap08164ae1-ac", "ovs_interfaceid": "08164ae1-ace4-4d80-ad79-1741eacfa16e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:28:00:5b,bridge_name='br-int',has_traffic_filtering=True,id=08164ae1-ace4-4d80-ad79-1741eacfa16e,network=Network(923d10dc-c67e-4426-9c6e-856e903e2446),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap08164ae1-ac') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG nova.objects.instance [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lazy-loading 'pci_devices' on Instance uuid 6528f05a-9f05-4f35-b991-687e4f47029e {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8aaa4e97-9439-4760-9e05-8b248b02074f/disk --force-share --output=json" returned: 0 in 0.150s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8aaa4e97-9439-4760-9e05-8b248b02074f/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] End _get_guest_xml xml= Apr 18 16:18:02 user nova-compute[70975]: 6528f05a-9f05-4f35-b991-687e4f47029e Apr 18 16:18:02 user nova-compute[70975]: instance-0000000e Apr 18 16:18:02 user nova-compute[70975]: 131072 Apr 18 
[guest domain XML elided: the libvirt XML dumped by _get_guest_xml between "End _get_guest_xml xml=" and the closing "{{(pid=70975) _get_guest_xml ... driver.py:7532}}" tag lost its markup when this log was captured; the stray timestamped fragments on the surrounding lines are what remains of it. The recoverable fields show uuid 6528f05a-9f05-4f35-b991-687e4f47029e, name instance-0000000e (display name tempest-ServerBootFromVolumeStableRescueTest-server-1865674245), 131072 KiB (128 MB) of memory, 1 vCPU, an hvm guest with CPU model Nehalem, sysinfo "OpenStack Foundation" / "OpenStack Nova" 0.0.0, and an RNG device backed by /dev/urandom.]
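For orientation, the same VIF / network_info JSON recurs several times in the records above and below (port 08164ae1-ace4-4d80-ad79-1741eacfa16e on network 923d10dc-c67e-4426-9c6e-856e903e2446). A small sketch that pulls out the fields the rest of the boot actually acts on; the dict literal is abridged from the log:

    vif = {
        "id": "08164ae1-ace4-4d80-ad79-1741eacfa16e",
        "address": "fa:16:3e:28:00:5b",
        "network": {
            "id": "923d10dc-c67e-4426-9c6e-856e903e2446",
            "bridge": "br-int",
            "subnets": [{
                "cidr": "10.0.0.0/28",
                "gateway": {"address": "10.0.0.1"},
                "ips": [{"address": "10.0.0.14", "type": "fixed"}],
            }],
            "meta": {"mtu": 1442, "tunneled": True},
        },
        "type": "ovs",
        "devname": "tap08164ae1-ac",
        "vnic_type": "normal",
        "active": False,
    }

    fixed_ips = [ip["address"]
                 for subnet in vif["network"]["subnets"]
                 for ip in subnet["ips"] if ip["type"] == "fixed"]
    print(vif["devname"], vif["address"], fixed_ips,
          vif["network"]["bridge"], vif["network"]["meta"]["mtu"])
    # tap08164ae1-ac fa:16:3e:28:00:5b ['10.0.0.14'] br-int 1442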
nova-compute[70975]: Apr 18 16:18:02 user nova-compute[70975]: Apr 18 16:18:02 user nova-compute[70975]: Apr 18 16:18:02 user nova-compute[70975]: Apr 18 16:18:02 user nova-compute[70975]: Apr 18 16:18:02 user nova-compute[70975]: {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:17:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1865674245',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1865674245',id=14,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f516f5ec45ca4508841c77f79e8c038b',ramdisk_id='',reservation_id='r-ad3dbxxv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-2021464272',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:18:01Z,user_data=None,user_id='c54c277689214bd0a2cadb1e2ac288a9',uuid=6528f05a-9f05-4f35-b991-687e4f47029e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "08164ae1-ace4-4d80-ad79-1741eacfa16e", "address": "fa:16:3e:28:00:5b", "network": {"id": "923d10dc-c67e-4426-9c6e-856e903e2446", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-131274818-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f516f5ec45ca4508841c77f79e8c038b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap08164ae1-ac", "ovs_interfaceid": 
"08164ae1-ace4-4d80-ad79-1741eacfa16e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Converting VIF {"id": "08164ae1-ace4-4d80-ad79-1741eacfa16e", "address": "fa:16:3e:28:00:5b", "network": {"id": "923d10dc-c67e-4426-9c6e-856e903e2446", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-131274818-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f516f5ec45ca4508841c77f79e8c038b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap08164ae1-ac", "ovs_interfaceid": "08164ae1-ace4-4d80-ad79-1741eacfa16e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:28:00:5b,bridge_name='br-int',has_traffic_filtering=True,id=08164ae1-ace4-4d80-ad79-1741eacfa16e,network=Network(923d10dc-c67e-4426-9c6e-856e903e2446),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap08164ae1-ac') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG os_vif [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:28:00:5b,bridge_name='br-int',has_traffic_filtering=True,id=08164ae1-ace4-4d80-ad79-1741eacfa16e,network=Network(923d10dc-c67e-4426-9c6e-856e903e2446),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap08164ae1-ac') {{(pid=70975) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70975) do_commit 
/usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap08164ae1-ac, may_exist=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap08164ae1-ac, col_values=(('external_ids', {'iface-id': '08164ae1-ace4-4d80-ad79-1741eacfa16e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:28:00:5b', 'vm-uuid': '6528f05a-9f05-4f35-b991-687e4f47029e'}),)) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:02 user nova-compute[70975]: INFO os_vif [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:28:00:5b,bridge_name='br-int',has_traffic_filtering=True,id=08164ae1-ace4-4d80-ad79-1741eacfa16e,network=Network(923d10dc-c67e-4426-9c6e-856e903e2446),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap08164ae1-ac') Apr 18 16:18:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8aaa4e97-9439-4760-9e05-8b248b02074f/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=70975) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] No VIF found with MAC fa:16:3e:28:00:5b, not building metadata {{(pid=70975) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:18:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6c592508-0444-4b42-a0b5-e3d8bd97f5ba/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:18:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6c592508-0444-4b42-a0b5-e3d8bd97f5ba/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:18:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6c592508-0444-4b42-a0b5-e3d8bd97f5ba/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:18:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 
-- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6c592508-0444-4b42-a0b5-e3d8bd97f5ba/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:18:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0ad9c135-f279-4bd8-982d-65b45242adcf/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:18:03 user nova-compute[70975]: DEBUG nova.network.neutron [req-8da7c696-7ccd-4c38-9674-dba57e11e644 req-2a9894fb-8445-4efa-82b0-304000383258 service nova] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Updated VIF entry in instance network info cache for port 08164ae1-ace4-4d80-ad79-1741eacfa16e. {{(pid=70975) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 18 16:18:03 user nova-compute[70975]: DEBUG nova.network.neutron [req-8da7c696-7ccd-4c38-9674-dba57e11e644 req-2a9894fb-8445-4efa-82b0-304000383258 service nova] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Updating instance_info_cache with network_info: [{"id": "08164ae1-ace4-4d80-ad79-1741eacfa16e", "address": "fa:16:3e:28:00:5b", "network": {"id": "923d10dc-c67e-4426-9c6e-856e903e2446", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-131274818-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f516f5ec45ca4508841c77f79e8c038b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap08164ae1-ac", "ovs_interfaceid": "08164ae1-ace4-4d80-ad79-1741eacfa16e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:18:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0ad9c135-f279-4bd8-982d-65b45242adcf/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:18:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0ad9c135-f279-4bd8-982d-65b45242adcf/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:18:03 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils 
[req-8da7c696-7ccd-4c38-9674-dba57e11e644 req-2a9894fb-8445-4efa-82b0-304000383258 service nova] Releasing lock "refresh_cache-6528f05a-9f05-4f35-b991-687e4f47029e" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:18:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0ad9c135-f279-4bd8-982d-65b45242adcf/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:18:04 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:04 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:18:04 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:18:04 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Hypervisor/Node resource view: name=user free_ram=8116MB free_disk=26.523357391357422GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": 
"type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", 
"vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70975) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 18 16:18:04 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:18:04 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:18:04 user nova-compute[70975]: DEBUG nova.compute.manager [req-2928e1eb-8680-4e16-abfb-63ed00edbe5c req-72f43896-8fbb-4732-afa0-6cce6412fa29 service nova] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Received event network-vif-plugged-08164ae1-ace4-4d80-ad79-1741eacfa16e {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:18:04 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-2928e1eb-8680-4e16-abfb-63ed00edbe5c req-72f43896-8fbb-4732-afa0-6cce6412fa29 service nova] Acquiring lock 
"6528f05a-9f05-4f35-b991-687e4f47029e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:18:04 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-2928e1eb-8680-4e16-abfb-63ed00edbe5c req-72f43896-8fbb-4732-afa0-6cce6412fa29 service nova] Lock "6528f05a-9f05-4f35-b991-687e4f47029e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:18:04 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-2928e1eb-8680-4e16-abfb-63ed00edbe5c req-72f43896-8fbb-4732-afa0-6cce6412fa29 service nova] Lock "6528f05a-9f05-4f35-b991-687e4f47029e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:18:04 user nova-compute[70975]: DEBUG nova.compute.manager [req-2928e1eb-8680-4e16-abfb-63ed00edbe5c req-72f43896-8fbb-4732-afa0-6cce6412fa29 service nova] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] No waiting events found dispatching network-vif-plugged-08164ae1-ace4-4d80-ad79-1741eacfa16e {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:18:04 user nova-compute[70975]: WARNING nova.compute.manager [req-2928e1eb-8680-4e16-abfb-63ed00edbe5c req-72f43896-8fbb-4732-afa0-6cce6412fa29 service nova] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Received unexpected event network-vif-plugged-08164ae1-ace4-4d80-ad79-1741eacfa16e for instance with vm_state building and task_state spawning. Apr 18 16:18:04 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance da82d905-1ca1-403d-9598-7561e69b9704 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:18:04 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 1b530349-680e-4def-86ef-29c340efa175 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:18:04 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance d7a293bf-a9bd-424e-ba11-bbed7dfea41c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:18:04 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 6c592508-0444-4b42-a0b5-e3d8bd97f5ba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:18:04 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 8aaa4e97-9439-4760-9e05-8b248b02074f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:18:04 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 0ad9c135-f279-4bd8-982d-65b45242adcf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:18:04 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance e32196da-a530-4422-8566-5edb01f3cc62 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:18:04 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:18:04 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance b71bd3c1-da58-4cb0-abc3-650e11b9d4ce actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:18:04 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 6528f05a-9f05-4f35-b991-687e4f47029e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:18:04 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Total usable vcpus: 12, total allocated vcpus: 10 {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 18 16:18:04 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Final resource view: name=user phys_ram=16023MB used_ram=1792MB phys_disk=40GB used_disk=10GB total_vcpus=12 used_vcpus=10 pci_stats=[] {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 18 16:18:04 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:04 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:04 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:04 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:18:04 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:18:04 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:04 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Compute_service record updated for user:user {{(pid=70975) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 18 16:18:04 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.588s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:18:05 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:18:05 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None 
req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:18:05 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Starting heal instance info cache {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 18 16:18:05 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "refresh_cache-d7a293bf-a9bd-424e-ba11-bbed7dfea41c" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:18:05 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquired lock "refresh_cache-d7a293bf-a9bd-424e-ba11-bbed7dfea41c" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:18:05 user nova-compute[70975]: DEBUG nova.network.neutron [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Forcefully refreshing network info cache for instance {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 18 16:18:06 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Resumed> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:18:06 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] VM Resumed (Lifecycle Event) Apr 18 16:18:06 user nova-compute[70975]: DEBUG nova.compute.manager [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Instance event wait completed in 0 seconds for {{(pid=70975) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 18 16:18:06 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Guest created on hypervisor {{(pid=70975) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 18 16:18:06 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Instance spawned successfully. 
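
Note on the VIF plug logged above: os_vif carries it out as one OVSDB transaction (AddBridgeCommand, AddPortCommand, then a DbSetCommand writing external_ids on the tap interface). The sketch below replays the same three steps with the ovs-vsctl CLI, using the bridge, port name, MAC, port ID and instance UUID taken from the log; it is an illustration of what the transaction does, not the code nova/os-vif actually runs (they talk to ovsdb-server natively through ovsdbapp, and datapath_type handling is omitted here).

#!/usr/bin/env python3
# Illustrative only: replays the OVS changes from the log above via ovs-vsctl.
# Requires access to the local ovsdb (typically root).
import subprocess

BRIDGE = "br-int"
PORT = "tap08164ae1-ac"                       # devname from the VIF in the log
EXTERNAL_IDS = {                              # col_values from the DbSetCommand
    "iface-id": "08164ae1-ace4-4d80-ad79-1741eacfa16e",
    "iface-status": "active",
    "attached-mac": "fa:16:3e:28:00:5b",
    "vm-uuid": "6528f05a-9f05-4f35-b991-687e4f47029e",
}

def vsctl(*args):
    subprocess.run(["ovs-vsctl", *args], check=True)

# AddBridgeCommand(may_exist=True) / AddPortCommand(may_exist=True)
vsctl("--may-exist", "add-br", BRIDGE)
vsctl("--may-exist", "add-port", BRIDGE, PORT)
# DbSetCommand(table=Interface, record=PORT, col_values=(('external_ids', {...}),))
vsctl("set", "Interface", PORT,
      *[f'external_ids:{k}="{v}"' for k, v in EXTERNAL_IDS.items()])

The may_exist/--may-exist flags make the calls idempotent, which is why the AddBridgeCommand earlier in the excerpt could complete with "Transaction caused no change".
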
Apr 18 16:18:06 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 18 16:18:06 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:18:06 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:18:06 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] During sync_power_state the instance has a pending task (spawning). Skip. Apr 18 16:18:06 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Started> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:18:06 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] VM Started (Lifecycle Event) Apr 18 16:18:06 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Found default for hw_cdrom_bus of ide {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:18:06 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Found default for hw_disk_bus of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:18:06 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Found default for hw_input_bus of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:18:06 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Found 
default for hw_pointer_model of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:18:06 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Found default for hw_video_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:18:06 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Found default for hw_vif_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:18:06 user nova-compute[70975]: DEBUG nova.network.neutron [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Updating instance_info_cache with network_info: [{"id": "e5d69d5c-1a5c-4300-ab15-e73f78388f0e", "address": "fa:16:3e:92:2d:7f", "network": {"id": "1bff5cf0-81ae-4bf0-88ab-fd844ee78e3f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1814061150-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "261e8ba82d9e4203917afb0241a3b4fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape5d69d5c-1a", "ovs_interfaceid": "e5d69d5c-1a5c-4300-ab15-e73f78388f0e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:18:06 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:18:06 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:18:06 user nova-compute[70975]: DEBUG nova.compute.manager [req-cda75f89-7eb7-46c2-ab25-5f0d240234f6 req-15a6278d-f3ce-4e2f-bbff-45e4c661d578 service nova] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Received event network-vif-plugged-08164ae1-ace4-4d80-ad79-1741eacfa16e {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:18:06 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-cda75f89-7eb7-46c2-ab25-5f0d240234f6 
req-15a6278d-f3ce-4e2f-bbff-45e4c661d578 service nova] Acquiring lock "6528f05a-9f05-4f35-b991-687e4f47029e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:18:06 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-cda75f89-7eb7-46c2-ab25-5f0d240234f6 req-15a6278d-f3ce-4e2f-bbff-45e4c661d578 service nova] Lock "6528f05a-9f05-4f35-b991-687e4f47029e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:18:06 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-cda75f89-7eb7-46c2-ab25-5f0d240234f6 req-15a6278d-f3ce-4e2f-bbff-45e4c661d578 service nova] Lock "6528f05a-9f05-4f35-b991-687e4f47029e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:18:06 user nova-compute[70975]: DEBUG nova.compute.manager [req-cda75f89-7eb7-46c2-ab25-5f0d240234f6 req-15a6278d-f3ce-4e2f-bbff-45e4c661d578 service nova] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] No waiting events found dispatching network-vif-plugged-08164ae1-ace4-4d80-ad79-1741eacfa16e {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:18:06 user nova-compute[70975]: WARNING nova.compute.manager [req-cda75f89-7eb7-46c2-ab25-5f0d240234f6 req-15a6278d-f3ce-4e2f-bbff-45e4c661d578 service nova] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Received unexpected event network-vif-plugged-08164ae1-ace4-4d80-ad79-1741eacfa16e for instance with vm_state building and task_state spawning. Apr 18 16:18:06 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Releasing lock "refresh_cache-d7a293bf-a9bd-424e-ba11-bbed7dfea41c" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:18:06 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Updated the network info_cache for instance {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 18 16:18:06 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:18:06 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] During sync_power_state the instance has a pending task (spawning). Skip. Apr 18 16:18:06 user nova-compute[70975]: INFO nova.compute.manager [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Took 5.69 seconds to spawn the instance on the hypervisor. 
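
Note on the repeated "qemu-img info ... --force-share --output=json" commands above: they appear to belong to the periodic resource audit (same request ID as the resource-tracker lines) measuring each instance disk, wrapped in oslo_concurrency.prlimit --as=1073741824 --cpu=30 so a misbehaving qemu-img cannot exhaust memory or CPU. A minimal sketch of issuing the same bounded probe with oslo.concurrency follows; it assumes the processutils.ProcessLimits API, and nova's own image helper may differ in detail.

# Sketch of the bounded disk probe seen above, using oslo.concurrency.
import json
from oslo_concurrency import processutils

QEMU_IMG_LIMITS = processutils.ProcessLimits(
    address_space=1073741824,   # --as=1073741824 (1 GiB) in the logged command
    cpu_time=30,                # --cpu=30
)

def disk_info(path):
    # Matches the logged command line:
    #   env LC_ALL=C LANG=C qemu-img info <path> --force-share --output=json
    out, _err = processutils.execute(
        'env', 'LC_ALL=C', 'LANG=C',
        'qemu-img', 'info', path, '--force-share', '--output=json',
        prlimit=QEMU_IMG_LIMITS)
    return json.loads(out)

# Example: one of the instance disks from the log.
info = disk_info('/opt/stack/data/nova/instances/'
                 '8aaa4e97-9439-4760-9e05-8b248b02074f/disk')
print(info.get('format'), info.get('virtual-size'))
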
Apr 18 16:18:06 user nova-compute[70975]: DEBUG nova.compute.manager [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:18:06 user nova-compute[70975]: INFO nova.compute.manager [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Took 6.42 seconds to build instance. Apr 18 16:18:06 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-af10cec7-6b35-4041-9935-8551a8bbafb8 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "6528f05a-9f05-4f35-b991-687e4f47029e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.522s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:18:07 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:07 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:12 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:12 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:22 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:22 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:27 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:27 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:29 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ad7d3288-c036-4534-831c-1ab19eedeeec tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Acquiring lock "0ad9c135-f279-4bd8-982d-65b45242adcf" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70975) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:18:29 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ad7d3288-c036-4534-831c-1ab19eedeeec tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Lock "0ad9c135-f279-4bd8-982d-65b45242adcf" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.002s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:18:29 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ad7d3288-c036-4534-831c-1ab19eedeeec tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Acquiring lock "0ad9c135-f279-4bd8-982d-65b45242adcf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:18:29 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ad7d3288-c036-4534-831c-1ab19eedeeec tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Lock "0ad9c135-f279-4bd8-982d-65b45242adcf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:18:29 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ad7d3288-c036-4534-831c-1ab19eedeeec tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Lock "0ad9c135-f279-4bd8-982d-65b45242adcf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:18:29 user nova-compute[70975]: INFO nova.compute.manager [None req-ad7d3288-c036-4534-831c-1ab19eedeeec tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Terminating instance Apr 18 16:18:29 user nova-compute[70975]: DEBUG nova.compute.manager [None req-ad7d3288-c036-4534-831c-1ab19eedeeec tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Start destroying the instance on the hypervisor. 
{{(pid=70975) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 18 16:18:29 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:29 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:29 user nova-compute[70975]: DEBUG nova.compute.manager [req-81a1ad6a-7770-4240-a42c-3253bb953bbe req-5221d92e-54aa-45ed-83d3-5116faea9655 service nova] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Received event network-vif-unplugged-21586886-79a5-4cab-bcfe-b52b65fbf177 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:18:29 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-81a1ad6a-7770-4240-a42c-3253bb953bbe req-5221d92e-54aa-45ed-83d3-5116faea9655 service nova] Acquiring lock "0ad9c135-f279-4bd8-982d-65b45242adcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:18:29 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-81a1ad6a-7770-4240-a42c-3253bb953bbe req-5221d92e-54aa-45ed-83d3-5116faea9655 service nova] Lock "0ad9c135-f279-4bd8-982d-65b45242adcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:18:29 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-81a1ad6a-7770-4240-a42c-3253bb953bbe req-5221d92e-54aa-45ed-83d3-5116faea9655 service nova] Lock "0ad9c135-f279-4bd8-982d-65b45242adcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:18:29 user nova-compute[70975]: DEBUG nova.compute.manager [req-81a1ad6a-7770-4240-a42c-3253bb953bbe req-5221d92e-54aa-45ed-83d3-5116faea9655 service nova] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] No waiting events found dispatching network-vif-unplugged-21586886-79a5-4cab-bcfe-b52b65fbf177 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:18:29 user nova-compute[70975]: DEBUG nova.compute.manager [req-81a1ad6a-7770-4240-a42c-3253bb953bbe req-5221d92e-54aa-45ed-83d3-5116faea9655 service nova] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Received event network-vif-unplugged-21586886-79a5-4cab-bcfe-b52b65fbf177 for instance with task_state deleting. 
{{(pid=70975) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 18 16:18:29 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:29 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:30 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:30 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Instance destroyed successfully. Apr 18 16:18:30 user nova-compute[70975]: DEBUG nova.objects.instance [None req-ad7d3288-c036-4534-831c-1ab19eedeeec tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Lazy-loading 'resources' on Instance uuid 0ad9c135-f279-4bd8-982d-65b45242adcf {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:18:30 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-ad7d3288-c036-4534-831c-1ab19eedeeec tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:16:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-995654785',display_name='tempest-VolumesAdminNegativeTest-server-995654785',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-995654785',id=10,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-18T16:16:43Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='8edf93a24e754e1ea58c0a7fd4f553dc',ramdisk_id='',reservation_id='r-3vh920j1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-VolumesAdminNegativeTest-2015888259',owner_user_name='tempest-VolumesAdminNegativeTest-2015888259-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-18T16:16:44Z,user_data=None,user_id='299ba2e202244f59a09e22df9ea8efe7',uuid=0ad9c135-f279-4bd8-9
82d-65b45242adcf,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "21586886-79a5-4cab-bcfe-b52b65fbf177", "address": "fa:16:3e:b1:99:de", "network": {"id": "0fafa4bf-5d8d-43f9-8cc3-ae6e7c58812c", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-891115046-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8edf93a24e754e1ea58c0a7fd4f553dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap21586886-79", "ovs_interfaceid": "21586886-79a5-4cab-bcfe-b52b65fbf177", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 18 16:18:30 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-ad7d3288-c036-4534-831c-1ab19eedeeec tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Converting VIF {"id": "21586886-79a5-4cab-bcfe-b52b65fbf177", "address": "fa:16:3e:b1:99:de", "network": {"id": "0fafa4bf-5d8d-43f9-8cc3-ae6e7c58812c", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-891115046-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8edf93a24e754e1ea58c0a7fd4f553dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap21586886-79", "ovs_interfaceid": "21586886-79a5-4cab-bcfe-b52b65fbf177", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:18:30 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-ad7d3288-c036-4534-831c-1ab19eedeeec tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:99:de,bridge_name='br-int',has_traffic_filtering=True,id=21586886-79a5-4cab-bcfe-b52b65fbf177,network=Network(0fafa4bf-5d8d-43f9-8cc3-ae6e7c58812c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21586886-79') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:18:30 user nova-compute[70975]: DEBUG os_vif [None req-ad7d3288-c036-4534-831c-1ab19eedeeec tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Unplugging vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:b1:99:de,bridge_name='br-int',has_traffic_filtering=True,id=21586886-79a5-4cab-bcfe-b52b65fbf177,network=Network(0fafa4bf-5d8d-43f9-8cc3-ae6e7c58812c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21586886-79') {{(pid=70975) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 18 16:18:30 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:30 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap21586886-79, bridge=br-int, if_exists=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:18:30 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:30 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:18:30 user nova-compute[70975]: INFO os_vif [None req-ad7d3288-c036-4534-831c-1ab19eedeeec tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:99:de,bridge_name='br-int',has_traffic_filtering=True,id=21586886-79a5-4cab-bcfe-b52b65fbf177,network=Network(0fafa4bf-5d8d-43f9-8cc3-ae6e7c58812c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21586886-79') Apr 18 16:18:30 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-ad7d3288-c036-4534-831c-1ab19eedeeec tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Deleting instance files /opt/stack/data/nova/instances/0ad9c135-f279-4bd8-982d-65b45242adcf_del Apr 18 16:18:30 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-ad7d3288-c036-4534-831c-1ab19eedeeec tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Deletion of /opt/stack/data/nova/instances/0ad9c135-f279-4bd8-982d-65b45242adcf_del complete Apr 18 16:18:30 user nova-compute[70975]: INFO nova.compute.manager [None req-ad7d3288-c036-4534-831c-1ab19eedeeec tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Took 0.66 seconds to destroy the instance on the hypervisor. Apr 18 16:18:30 user nova-compute[70975]: DEBUG oslo.service.loopingcall [None req-ad7d3288-c036-4534-831c-1ab19eedeeec tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=70975) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 18 16:18:30 user nova-compute[70975]: DEBUG nova.compute.manager [-] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Deallocating network for instance {{(pid=70975) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 18 16:18:30 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] deallocate_for_instance() {{(pid=70975) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 18 16:18:30 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Updating instance_info_cache with network_info: [] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:18:30 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Took 0.47 seconds to deallocate network for instance. Apr 18 16:18:30 user nova-compute[70975]: DEBUG nova.compute.manager [req-8b785bfc-a0e1-4662-8faa-ac1c2c8833f0 req-aa3731ab-41db-420d-900d-dc64e85c2068 service nova] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Received event network-vif-deleted-21586886-79a5-4cab-bcfe-b52b65fbf177 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:18:30 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ad7d3288-c036-4534-831c-1ab19eedeeec tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:18:30 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ad7d3288-c036-4534-831c-1ab19eedeeec tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:18:31 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-ad7d3288-c036-4534-831c-1ab19eedeeec tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:18:31 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-ad7d3288-c036-4534-831c-1ab19eedeeec tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:18:31 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ad7d3288-c036-4534-831c-1ab19eedeeec 
tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.374s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:18:31 user nova-compute[70975]: INFO nova.scheduler.client.report [None req-ad7d3288-c036-4534-831c-1ab19eedeeec tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Deleted allocations for instance 0ad9c135-f279-4bd8-982d-65b45242adcf Apr 18 16:18:31 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ad7d3288-c036-4534-831c-1ab19eedeeec tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Lock "0ad9c135-f279-4bd8-982d-65b45242adcf" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.750s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:18:31 user nova-compute[70975]: DEBUG nova.compute.manager [req-c31293f5-7957-4457-bf4a-f6324dddd89b req-f432a9b9-f6b6-4e3b-a739-81e2ee42067f service nova] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Received event network-vif-plugged-21586886-79a5-4cab-bcfe-b52b65fbf177 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:18:31 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-c31293f5-7957-4457-bf4a-f6324dddd89b req-f432a9b9-f6b6-4e3b-a739-81e2ee42067f service nova] Acquiring lock "0ad9c135-f279-4bd8-982d-65b45242adcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:18:31 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-c31293f5-7957-4457-bf4a-f6324dddd89b req-f432a9b9-f6b6-4e3b-a739-81e2ee42067f service nova] Lock "0ad9c135-f279-4bd8-982d-65b45242adcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:18:31 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-c31293f5-7957-4457-bf4a-f6324dddd89b req-f432a9b9-f6b6-4e3b-a739-81e2ee42067f service nova] Lock "0ad9c135-f279-4bd8-982d-65b45242adcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:18:31 user nova-compute[70975]: DEBUG nova.compute.manager [req-c31293f5-7957-4457-bf4a-f6324dddd89b req-f432a9b9-f6b6-4e3b-a739-81e2ee42067f service nova] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] No waiting events found dispatching network-vif-plugged-21586886-79a5-4cab-bcfe-b52b65fbf177 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:18:31 user nova-compute[70975]: WARNING nova.compute.manager [req-c31293f5-7957-4457-bf4a-f6324dddd89b req-f432a9b9-f6b6-4e3b-a739-81e2ee42067f service nova] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Received unexpected event network-vif-plugged-21586886-79a5-4cab-bcfe-b52b65fbf177 for instance with vm_state deleted and task_state None. 
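The report-client entries just above skip the placement update because the recomputed inventory for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 matches what was last written ("Inventory has not changed ... based on inventory data"). Below is a minimal stand-alone sketch of that compare-before-update idea, using the dict shape taken from the logged payload; the function name and script framing are assumptions, not nova's implementation.

# Minimal sketch (not nova code) of the "inventory has not changed" check the
# report client logs above; the dict layout mirrors the logged payload.
CURRENT_INVENTORY = {
    'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12,
             'step_size': 1, 'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1,
                  'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0},
    'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40,
                'step_size': 1, 'allocation_ratio': 1.0},
}

def inventory_changed(current, proposed):
    """Return True only if a resource class was added/removed or a field differs."""
    return current.keys() != proposed.keys() or any(
        current[rc] != proposed[rc] for rc in current)

if __name__ == '__main__':
    # Same numbers recomputed after the delete -> nothing to send to placement.
    proposed = {rc: dict(fields) for rc, fields in CURRENT_INVENTORY.items()}
    print(inventory_changed(CURRENT_INVENTORY, proposed))  # False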
Apr 18 16:18:32 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:35 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:37 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:40 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:42 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:43 user nova-compute[70975]: DEBUG nova.compute.manager [req-d3d0e283-0d58-428c-823a-09b4f2a7250b req-0af84aa7-7dec-4f5d-a480-a6582f9f6cf7 service nova] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Received event network-changed-203a232c-488a-427e-bf18-e99feec680b6 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:18:43 user nova-compute[70975]: DEBUG nova.compute.manager [req-d3d0e283-0d58-428c-823a-09b4f2a7250b req-0af84aa7-7dec-4f5d-a480-a6582f9f6cf7 service nova] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Refreshing instance network info cache due to event network-changed-203a232c-488a-427e-bf18-e99feec680b6. {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 18 16:18:43 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-d3d0e283-0d58-428c-823a-09b4f2a7250b req-0af84aa7-7dec-4f5d-a480-a6582f9f6cf7 service nova] Acquiring lock "refresh_cache-e32196da-a530-4422-8566-5edb01f3cc62" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:18:43 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-d3d0e283-0d58-428c-823a-09b4f2a7250b req-0af84aa7-7dec-4f5d-a480-a6582f9f6cf7 service nova] Acquired lock "refresh_cache-e32196da-a530-4422-8566-5edb01f3cc62" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:18:43 user nova-compute[70975]: DEBUG nova.network.neutron [req-d3d0e283-0d58-428c-823a-09b4f2a7250b req-0af84aa7-7dec-4f5d-a480-a6582f9f6cf7 service nova] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Refreshing network info cache for port 203a232c-488a-427e-bf18-e99feec680b6 {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 18 16:18:44 user nova-compute[70975]: DEBUG nova.network.neutron [req-d3d0e283-0d58-428c-823a-09b4f2a7250b req-0af84aa7-7dec-4f5d-a480-a6582f9f6cf7 service nova] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Updated VIF entry in instance network info cache for port 203a232c-488a-427e-bf18-e99feec680b6. 
{{(pid=70975) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 18 16:18:44 user nova-compute[70975]: DEBUG nova.network.neutron [req-d3d0e283-0d58-428c-823a-09b4f2a7250b req-0af84aa7-7dec-4f5d-a480-a6582f9f6cf7 service nova] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Updating instance_info_cache with network_info: [{"id": "203a232c-488a-427e-bf18-e99feec680b6", "address": "fa:16:3e:e2:df:e9", "network": {"id": "51cddd0f-0e4b-4d37-be40-ce5592263bc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1803491920-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.70", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f9987eeaa6b24ca48e80e8d5318f02ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap203a232c-48", "ovs_interfaceid": "203a232c-488a-427e-bf18-e99feec680b6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:18:44 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-d3d0e283-0d58-428c-823a-09b4f2a7250b req-0af84aa7-7dec-4f5d-a480-a6582f9f6cf7 service nova] Releasing lock "refresh_cache-e32196da-a530-4422-8566-5edb01f3cc62" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:18:45 user nova-compute[70975]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:18:45 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] VM Stopped (Lifecycle Event) Apr 18 16:18:45 user nova-compute[70975]: DEBUG nova.compute.manager [None req-4bcdf7fa-e85a-4e6c-9460-03d7d7e68c7e None None] [instance: 0ad9c135-f279-4bd8-982d-65b45242adcf] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:18:45 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:45 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-b0ab199c-5f6f-4c4d-8aae-ff46deaa4ee6 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Acquiring lock "e32196da-a530-4422-8566-5edb01f3cc62" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:18:45 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-b0ab199c-5f6f-4c4d-8aae-ff46deaa4ee6 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Lock "e32196da-a530-4422-8566-5edb01f3cc62" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70975) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:18:45 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-b0ab199c-5f6f-4c4d-8aae-ff46deaa4ee6 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Acquiring lock "e32196da-a530-4422-8566-5edb01f3cc62-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:18:45 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-b0ab199c-5f6f-4c4d-8aae-ff46deaa4ee6 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Lock "e32196da-a530-4422-8566-5edb01f3cc62-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:18:45 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-b0ab199c-5f6f-4c4d-8aae-ff46deaa4ee6 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Lock "e32196da-a530-4422-8566-5edb01f3cc62-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:18:45 user nova-compute[70975]: INFO nova.compute.manager [None req-b0ab199c-5f6f-4c4d-8aae-ff46deaa4ee6 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Terminating instance Apr 18 16:18:45 user nova-compute[70975]: DEBUG nova.compute.manager [None req-b0ab199c-5f6f-4c4d-8aae-ff46deaa4ee6 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Start destroying the instance on the hypervisor. 
{{(pid=70975) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 18 16:18:45 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:45 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:45 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:45 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:45 user nova-compute[70975]: DEBUG nova.compute.manager [req-3ae5d5a6-4afe-417a-86fa-6e8f82d57e9f req-bab7f724-643c-48c5-9dd6-21fc3e5919ed service nova] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Received event network-vif-unplugged-203a232c-488a-427e-bf18-e99feec680b6 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:18:45 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-3ae5d5a6-4afe-417a-86fa-6e8f82d57e9f req-bab7f724-643c-48c5-9dd6-21fc3e5919ed service nova] Acquiring lock "e32196da-a530-4422-8566-5edb01f3cc62-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:18:45 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-3ae5d5a6-4afe-417a-86fa-6e8f82d57e9f req-bab7f724-643c-48c5-9dd6-21fc3e5919ed service nova] Lock "e32196da-a530-4422-8566-5edb01f3cc62-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:18:45 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-3ae5d5a6-4afe-417a-86fa-6e8f82d57e9f req-bab7f724-643c-48c5-9dd6-21fc3e5919ed service nova] Lock "e32196da-a530-4422-8566-5edb01f3cc62-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:18:45 user nova-compute[70975]: DEBUG nova.compute.manager [req-3ae5d5a6-4afe-417a-86fa-6e8f82d57e9f req-bab7f724-643c-48c5-9dd6-21fc3e5919ed service nova] [instance: e32196da-a530-4422-8566-5edb01f3cc62] No waiting events found dispatching network-vif-unplugged-203a232c-488a-427e-bf18-e99feec680b6 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:18:45 user nova-compute[70975]: DEBUG nova.compute.manager [req-3ae5d5a6-4afe-417a-86fa-6e8f82d57e9f req-bab7f724-643c-48c5-9dd6-21fc3e5919ed service nova] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Received event network-vif-unplugged-203a232c-488a-427e-bf18-e99feec680b6 for instance with task_state deleting. 
{{(pid=70975) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 18 16:18:46 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:46 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:46 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:46 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Instance destroyed successfully. Apr 18 16:18:46 user nova-compute[70975]: DEBUG nova.objects.instance [None req-b0ab199c-5f6f-4c4d-8aae-ff46deaa4ee6 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Lazy-loading 'resources' on Instance uuid e32196da-a530-4422-8566-5edb01f3cc62 {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:18:46 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-b0ab199c-5f6f-4c4d-8aae-ff46deaa4ee6 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:16:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1520665803',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-1520665803',id=11,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIM5ALzjyybYMOK2Z1cVTnyDj3Z+LX/Xt2LMBmK17WbJDSTepQBn453Oo4oGngedtyHHoL/jHz286S3ijelgC//rOYsCgxrKNn3otRSI8UvONPGZU5icbqSs6c6+xBe3GQ==',key_name='tempest-keypair-48834850',keypairs=,launch_index=0,launched_at=2023-04-18T16:16:59Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='f9987eeaa6b24ca48e80e8d5318f02ac',ramdisk_id='',reservation_id='r-s3q9g9uz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeShelveTestJSON-1663710151',owner_user_name='tempest-AttachVolumeShelveTestJSON-1663710151-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-18T16:16:59Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='73a99bbf510f4f67bb7a35901ba3edc5',uuid=e32196da-a530-4422-8566-5edb01f3cc62,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "203a232c-488a-427e-bf18-e99feec680b6", "address": "fa:16:3e:e2:df:e9", "network": {"id": "51cddd0f-0e4b-4d37-be40-ce5592263bc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1803491920-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.70", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f9987eeaa6b24ca48e80e8d5318f02ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap203a232c-48", "ovs_interfaceid": "203a232c-488a-427e-bf18-e99feec680b6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 18 16:18:46 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-b0ab199c-5f6f-4c4d-8aae-ff46deaa4ee6 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Converting VIF {"id": "203a232c-488a-427e-bf18-e99feec680b6", "address": "fa:16:3e:e2:df:e9", "network": {"id": "51cddd0f-0e4b-4d37-be40-ce5592263bc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1803491920-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.70", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f9987eeaa6b24ca48e80e8d5318f02ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap203a232c-48", "ovs_interfaceid": "203a232c-488a-427e-bf18-e99feec680b6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:18:46 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-b0ab199c-5f6f-4c4d-8aae-ff46deaa4ee6 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e2:df:e9,bridge_name='br-int',has_traffic_filtering=True,id=203a232c-488a-427e-bf18-e99feec680b6,network=Network(51cddd0f-0e4b-4d37-be40-ce5592263bc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap203a232c-48') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:18:46 user nova-compute[70975]: DEBUG os_vif [None req-b0ab199c-5f6f-4c4d-8aae-ff46deaa4ee6 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e2:df:e9,bridge_name='br-int',has_traffic_filtering=True,id=203a232c-488a-427e-bf18-e99feec680b6,network=Network(51cddd0f-0e4b-4d37-be40-ce5592263bc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap203a232c-48') {{(pid=70975) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 18 16:18:46 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:46 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap203a232c-48, bridge=br-int, if_exists=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:18:46 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:46 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:46 user nova-compute[70975]: INFO os_vif [None req-b0ab199c-5f6f-4c4d-8aae-ff46deaa4ee6 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e2:df:e9,bridge_name='br-int',has_traffic_filtering=True,id=203a232c-488a-427e-bf18-e99feec680b6,network=Network(51cddd0f-0e4b-4d37-be40-ce5592263bc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap203a232c-48') Apr 18 16:18:46 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None 
req-b0ab199c-5f6f-4c4d-8aae-ff46deaa4ee6 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Deleting instance files /opt/stack/data/nova/instances/e32196da-a530-4422-8566-5edb01f3cc62_del Apr 18 16:18:46 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-b0ab199c-5f6f-4c4d-8aae-ff46deaa4ee6 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Deletion of /opt/stack/data/nova/instances/e32196da-a530-4422-8566-5edb01f3cc62_del complete Apr 18 16:18:46 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:46 user nova-compute[70975]: INFO nova.compute.manager [None req-b0ab199c-5f6f-4c4d-8aae-ff46deaa4ee6 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Took 0.86 seconds to destroy the instance on the hypervisor. Apr 18 16:18:46 user nova-compute[70975]: DEBUG oslo.service.loopingcall [None req-b0ab199c-5f6f-4c4d-8aae-ff46deaa4ee6 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=70975) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 18 16:18:46 user nova-compute[70975]: DEBUG nova.compute.manager [-] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Deallocating network for instance {{(pid=70975) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 18 16:18:46 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: e32196da-a530-4422-8566-5edb01f3cc62] deallocate_for_instance() {{(pid=70975) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 18 16:18:46 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:46 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:47 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:47 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:47 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:47 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Updating instance_info_cache with network_info: [] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:18:47 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Took 0.83 seconds to deallocate network for instance. 
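A few entries above, os-vif removes the tap device from br-int by running an ovsdbapp transaction (DelPortCommand with if_exists=True). The sketch below issues the same kind of command directly through ovsdbapp; the OVSDB socket path and timeout are assumptions, and this is not os-vif's own plug/unplug code.

# Sketch only: delete a port from br-int the way the logged
# DelPortCommand(..., if_exists=True) transaction does. Socket path and
# timeout are assumptions; error handling is omitted.
from ovsdbapp.backend.ovs_idl import connection
from ovsdbapp.schema.open_vswitch import impl_idl

OVSDB_CONNECTION = 'unix:/var/run/openvswitch/db.sock'  # assumed location

idl = connection.OvsdbIdl.from_server(OVSDB_CONNECTION, 'Open_vSwitch')
ovs = impl_idl.OvsdbIdl(connection.Connection(idl, timeout=10))

# if_exists=True makes this a no-op when the tap is already gone, which is
# why the logged transaction is safe to run during teardown races.
ovs.del_port('tap203a232c-48', bridge='br-int', if_exists=True).execute(
    check_error=True)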
Apr 18 16:18:47 user nova-compute[70975]: DEBUG nova.compute.manager [req-1a82560d-84b3-4a46-b59c-255501952133 req-1c684939-e8e4-4e9b-ae0a-00ce889a0b3c service nova] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Received event network-vif-deleted-203a232c-488a-427e-bf18-e99feec680b6 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:18:47 user nova-compute[70975]: INFO nova.compute.manager [req-1a82560d-84b3-4a46-b59c-255501952133 req-1c684939-e8e4-4e9b-ae0a-00ce889a0b3c service nova] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Neutron deleted interface 203a232c-488a-427e-bf18-e99feec680b6; detaching it from the instance and deleting it from the info cache Apr 18 16:18:47 user nova-compute[70975]: DEBUG nova.network.neutron [req-1a82560d-84b3-4a46-b59c-255501952133 req-1c684939-e8e4-4e9b-ae0a-00ce889a0b3c service nova] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Updating instance_info_cache with network_info: [] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:18:47 user nova-compute[70975]: DEBUG nova.compute.manager [req-1a82560d-84b3-4a46-b59c-255501952133 req-1c684939-e8e4-4e9b-ae0a-00ce889a0b3c service nova] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Detach interface failed, port_id=203a232c-488a-427e-bf18-e99feec680b6, reason: Instance e32196da-a530-4422-8566-5edb01f3cc62 could not be found. {{(pid=70975) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 18 16:18:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-b0ab199c-5f6f-4c4d-8aae-ff46deaa4ee6 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:18:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-b0ab199c-5f6f-4c4d-8aae-ff46deaa4ee6 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:18:47 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:47 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-b0ab199c-5f6f-4c4d-8aae-ff46deaa4ee6 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:18:47 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-b0ab199c-5f6f-4c4d-8aae-ff46deaa4ee6 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 
'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:18:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-b0ab199c-5f6f-4c4d-8aae-ff46deaa4ee6 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.292s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:18:47 user nova-compute[70975]: INFO nova.scheduler.client.report [None req-b0ab199c-5f6f-4c4d-8aae-ff46deaa4ee6 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Deleted allocations for instance e32196da-a530-4422-8566-5edb01f3cc62 Apr 18 16:18:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-b0ab199c-5f6f-4c4d-8aae-ff46deaa4ee6 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Lock "e32196da-a530-4422-8566-5edb01f3cc62" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.151s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:18:48 user nova-compute[70975]: DEBUG nova.compute.manager [req-68aa021e-d05f-4c81-83cf-16da192d28af req-d7a5a978-3727-44ac-87da-e9d6d3205d6c service nova] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Received event network-vif-plugged-203a232c-488a-427e-bf18-e99feec680b6 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:18:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-68aa021e-d05f-4c81-83cf-16da192d28af req-d7a5a978-3727-44ac-87da-e9d6d3205d6c service nova] Acquiring lock "e32196da-a530-4422-8566-5edb01f3cc62-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:18:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-68aa021e-d05f-4c81-83cf-16da192d28af req-d7a5a978-3727-44ac-87da-e9d6d3205d6c service nova] Lock "e32196da-a530-4422-8566-5edb01f3cc62-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:18:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-68aa021e-d05f-4c81-83cf-16da192d28af req-d7a5a978-3727-44ac-87da-e9d6d3205d6c service nova] Lock "e32196da-a530-4422-8566-5edb01f3cc62-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:18:48 user nova-compute[70975]: DEBUG nova.compute.manager [req-68aa021e-d05f-4c81-83cf-16da192d28af req-d7a5a978-3727-44ac-87da-e9d6d3205d6c service nova] [instance: e32196da-a530-4422-8566-5edb01f3cc62] No waiting events found dispatching network-vif-plugged-203a232c-488a-427e-bf18-e99feec680b6 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:18:48 user nova-compute[70975]: WARNING nova.compute.manager 
[req-68aa021e-d05f-4c81-83cf-16da192d28af req-d7a5a978-3727-44ac-87da-e9d6d3205d6c service nova] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Received unexpected event network-vif-plugged-203a232c-488a-427e-bf18-e99feec680b6 for instance with vm_state deleted and task_state None. Apr 18 16:18:48 user nova-compute[70975]: DEBUG nova.compute.manager [req-68aa021e-d05f-4c81-83cf-16da192d28af req-d7a5a978-3727-44ac-87da-e9d6d3205d6c service nova] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Received event network-vif-plugged-203a232c-488a-427e-bf18-e99feec680b6 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:18:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-68aa021e-d05f-4c81-83cf-16da192d28af req-d7a5a978-3727-44ac-87da-e9d6d3205d6c service nova] Acquiring lock "e32196da-a530-4422-8566-5edb01f3cc62-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:18:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-68aa021e-d05f-4c81-83cf-16da192d28af req-d7a5a978-3727-44ac-87da-e9d6d3205d6c service nova] Lock "e32196da-a530-4422-8566-5edb01f3cc62-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:18:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-68aa021e-d05f-4c81-83cf-16da192d28af req-d7a5a978-3727-44ac-87da-e9d6d3205d6c service nova] Lock "e32196da-a530-4422-8566-5edb01f3cc62-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:18:48 user nova-compute[70975]: DEBUG nova.compute.manager [req-68aa021e-d05f-4c81-83cf-16da192d28af req-d7a5a978-3727-44ac-87da-e9d6d3205d6c service nova] [instance: e32196da-a530-4422-8566-5edb01f3cc62] No waiting events found dispatching network-vif-plugged-203a232c-488a-427e-bf18-e99feec680b6 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:18:48 user nova-compute[70975]: WARNING nova.compute.manager [req-68aa021e-d05f-4c81-83cf-16da192d28af req-d7a5a978-3727-44ac-87da-e9d6d3205d6c service nova] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Received unexpected event network-vif-plugged-203a232c-488a-427e-bf18-e99feec680b6 for instance with vm_state deleted and task_state None. 
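The repeated "No waiting events found dispatching network-vif-plugged-..." / "Received unexpected event ... vm_state deleted" pairs above and below are what the compute manager prints when a Neutron notification arrives after the instance is already gone, so nothing is registered to consume it. The toy waiter-registry below sketches that pattern; the class and method names are illustrative and do not come from nova.

# Toy model of an external-event registry: callers register per
# (instance, event), and late events that find no waiter are reported as
# unexpected. Purely illustrative; names do not come from nova.
import threading
from collections import defaultdict

class EventRegistry:
    def __init__(self):
        self._lock = threading.Lock()
        self._waiters = defaultdict(dict)  # instance uuid -> {event: Event}

    def prepare(self, instance_uuid, event_name):
        """Register interest in an event before triggering the operation."""
        waiter = threading.Event()
        with self._lock:
            self._waiters[instance_uuid][event_name] = waiter
        return waiter

    def dispatch(self, instance_uuid, event_name, vm_state):
        """Deliver an external event; report it if nobody is waiting."""
        with self._lock:
            waiter = self._waiters.get(instance_uuid, {}).pop(event_name, None)
        if waiter is None:
            print(f'No waiting events found dispatching {event_name}; '
                  f'unexpected for instance with vm_state {vm_state}')
            return
        waiter.set()

if __name__ == '__main__':
    registry = EventRegistry()
    # The instance was already deleted, so no waiter exists and the late
    # network-vif-plugged event is reported as unexpected.
    registry.dispatch('e32196da-a530-4422-8566-5edb01f3cc62',
                      'network-vif-plugged-203a232c-488a-427e-bf18-e99feec680b6',
                      vm_state='deleted')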
Apr 18 16:18:48 user nova-compute[70975]: DEBUG nova.compute.manager [req-68aa021e-d05f-4c81-83cf-16da192d28af req-d7a5a978-3727-44ac-87da-e9d6d3205d6c service nova] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Received event network-vif-plugged-203a232c-488a-427e-bf18-e99feec680b6 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:18:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-68aa021e-d05f-4c81-83cf-16da192d28af req-d7a5a978-3727-44ac-87da-e9d6d3205d6c service nova] Acquiring lock "e32196da-a530-4422-8566-5edb01f3cc62-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:18:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-68aa021e-d05f-4c81-83cf-16da192d28af req-d7a5a978-3727-44ac-87da-e9d6d3205d6c service nova] Lock "e32196da-a530-4422-8566-5edb01f3cc62-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:18:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-68aa021e-d05f-4c81-83cf-16da192d28af req-d7a5a978-3727-44ac-87da-e9d6d3205d6c service nova] Lock "e32196da-a530-4422-8566-5edb01f3cc62-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:18:48 user nova-compute[70975]: DEBUG nova.compute.manager [req-68aa021e-d05f-4c81-83cf-16da192d28af req-d7a5a978-3727-44ac-87da-e9d6d3205d6c service nova] [instance: e32196da-a530-4422-8566-5edb01f3cc62] No waiting events found dispatching network-vif-plugged-203a232c-488a-427e-bf18-e99feec680b6 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:18:48 user nova-compute[70975]: WARNING nova.compute.manager [req-68aa021e-d05f-4c81-83cf-16da192d28af req-d7a5a978-3727-44ac-87da-e9d6d3205d6c service nova] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Received unexpected event network-vif-plugged-203a232c-488a-427e-bf18-e99feec680b6 for instance with vm_state deleted and task_state None. 
Apr 18 16:18:48 user nova-compute[70975]: DEBUG nova.compute.manager [req-68aa021e-d05f-4c81-83cf-16da192d28af req-d7a5a978-3727-44ac-87da-e9d6d3205d6c service nova] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Received event network-vif-unplugged-203a232c-488a-427e-bf18-e99feec680b6 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:18:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-68aa021e-d05f-4c81-83cf-16da192d28af req-d7a5a978-3727-44ac-87da-e9d6d3205d6c service nova] Acquiring lock "e32196da-a530-4422-8566-5edb01f3cc62-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:18:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-68aa021e-d05f-4c81-83cf-16da192d28af req-d7a5a978-3727-44ac-87da-e9d6d3205d6c service nova] Lock "e32196da-a530-4422-8566-5edb01f3cc62-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:18:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-68aa021e-d05f-4c81-83cf-16da192d28af req-d7a5a978-3727-44ac-87da-e9d6d3205d6c service nova] Lock "e32196da-a530-4422-8566-5edb01f3cc62-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:18:48 user nova-compute[70975]: DEBUG nova.compute.manager [req-68aa021e-d05f-4c81-83cf-16da192d28af req-d7a5a978-3727-44ac-87da-e9d6d3205d6c service nova] [instance: e32196da-a530-4422-8566-5edb01f3cc62] No waiting events found dispatching network-vif-unplugged-203a232c-488a-427e-bf18-e99feec680b6 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:18:48 user nova-compute[70975]: WARNING nova.compute.manager [req-68aa021e-d05f-4c81-83cf-16da192d28af req-d7a5a978-3727-44ac-87da-e9d6d3205d6c service nova] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Received unexpected event network-vif-unplugged-203a232c-488a-427e-bf18-e99feec680b6 for instance with vm_state deleted and task_state None. 
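The oslo.concurrency messages that dominate this trace also carry timing: ":: waited N.NNNs" when a lock is acquired and ":: held N.NNNs" when it is released (for example, do_terminate_instance held the per-instance lock for 1.750s and 2.151s above). A small stdlib sketch for pulling those figures out of a dump like this one follows; the regex and the summary format are assumptions.

# Stdlib-only sketch: aggregate the "held N.NNNs" figures that
# oslo_concurrency.lockutils logs on lock release, as seen in the
# surrounding lines. Regex and output format are assumptions.
import re
from collections import defaultdict

HELD_RE = re.compile(
    r'Lock "(?P<name>[^"]+)" "released" by "[^"]+" :: held (?P<secs>[\d.]+)s')

def held_times(lines):
    """Map lock name -> list of hold durations in seconds."""
    out = defaultdict(list)
    for line in lines:
        for match in HELD_RE.finditer(line):
            out[match['name']].append(float(match['secs']))
    return out

if __name__ == '__main__':
    sample = [
        'Lock "compute_resources" "released" by "nova.compute.resource_tracker.'
        'ResourceTracker.update_usage" :: held 0.292s',
    ]
    for name, secs in held_times(sample).items():
        print(f'{name}: count={len(secs)} max={max(secs):.3f}s')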
Apr 18 16:18:48 user nova-compute[70975]: DEBUG nova.compute.manager [req-68aa021e-d05f-4c81-83cf-16da192d28af req-d7a5a978-3727-44ac-87da-e9d6d3205d6c service nova] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Received event network-vif-plugged-203a232c-488a-427e-bf18-e99feec680b6 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:18:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-68aa021e-d05f-4c81-83cf-16da192d28af req-d7a5a978-3727-44ac-87da-e9d6d3205d6c service nova] Acquiring lock "e32196da-a530-4422-8566-5edb01f3cc62-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:18:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-68aa021e-d05f-4c81-83cf-16da192d28af req-d7a5a978-3727-44ac-87da-e9d6d3205d6c service nova] Lock "e32196da-a530-4422-8566-5edb01f3cc62-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:18:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-68aa021e-d05f-4c81-83cf-16da192d28af req-d7a5a978-3727-44ac-87da-e9d6d3205d6c service nova] Lock "e32196da-a530-4422-8566-5edb01f3cc62-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:18:48 user nova-compute[70975]: DEBUG nova.compute.manager [req-68aa021e-d05f-4c81-83cf-16da192d28af req-d7a5a978-3727-44ac-87da-e9d6d3205d6c service nova] [instance: e32196da-a530-4422-8566-5edb01f3cc62] No waiting events found dispatching network-vif-plugged-203a232c-488a-427e-bf18-e99feec680b6 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:18:48 user nova-compute[70975]: WARNING nova.compute.manager [req-68aa021e-d05f-4c81-83cf-16da192d28af req-d7a5a978-3727-44ac-87da-e9d6d3205d6c service nova] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Received unexpected event network-vif-plugged-203a232c-488a-427e-bf18-e99feec680b6 for instance with vm_state deleted and task_state None. 
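Everything from "Terminating instance" through "Deleted allocations" above is tied together by the instance UUID (and, for the Neutron callbacks, by the req-/req- identifiers). The stdlib sketch below follows one instance through a journal dump like this file; the command-line usage and the prefix regex are assumptions, and wrapped continuation lines without the syslog prefix are simply skipped.

# Stdlib sketch: print every nova-compute record that mentions one instance
# UUID from a journald-style dump such as this file. The prefix regex and the
# CLI usage are assumptions; continuation lines are ignored.
import re
import sys

LINE_RE = re.compile(
    r'^(?P<ts>\w{3} +\d+ [\d:]+) \S+ nova-compute\[\d+\]: '
    r'(?P<level>[A-Z]+) (?P<logger>\S+) (?P<rest>.*)$')

def entries_for_instance(path, uuid):
    """Yield (timestamp, level, logger, message) for lines mentioning uuid."""
    with open(path, encoding='utf-8') as handle:
        for line in handle:
            if uuid not in line:
                continue
            match = LINE_RE.match(line)
            if match:
                yield match['ts'], match['level'], match['logger'], match['rest']

if __name__ == '__main__':
    # e.g. python trace_instance.py nova-compute.log e32196da-a530-4422-8566-5edb01f3cc62
    for ts, level, logger, msg in entries_for_instance(sys.argv[1], sys.argv[2]):
        print(ts, level, logger, msg[:120])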
Apr 18 16:18:50 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-42975aef-8463-40b6-9b5e-5a79d07262bb tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Acquiring lock "8aaa4e97-9439-4760-9e05-8b248b02074f" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:18:50 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-42975aef-8463-40b6-9b5e-5a79d07262bb tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Lock "8aaa4e97-9439-4760-9e05-8b248b02074f" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:18:50 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-42975aef-8463-40b6-9b5e-5a79d07262bb tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Acquiring lock "8aaa4e97-9439-4760-9e05-8b248b02074f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:18:50 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-42975aef-8463-40b6-9b5e-5a79d07262bb tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Lock "8aaa4e97-9439-4760-9e05-8b248b02074f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:18:50 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-42975aef-8463-40b6-9b5e-5a79d07262bb tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Lock "8aaa4e97-9439-4760-9e05-8b248b02074f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:18:50 user nova-compute[70975]: INFO nova.compute.manager [None req-42975aef-8463-40b6-9b5e-5a79d07262bb tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Terminating instance Apr 18 16:18:50 user nova-compute[70975]: DEBUG nova.compute.manager [None req-42975aef-8463-40b6-9b5e-5a79d07262bb tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Start destroying the instance on the hypervisor. 
{{(pid=70975) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 18 16:18:50 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:50 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:50 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:50 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:50 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:51 user nova-compute[70975]: DEBUG nova.compute.manager [req-f07a24e1-f14b-4f98-9e28-9ae32e18264d req-af49145e-b2b4-47c8-8636-f22d15664247 service nova] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Received event network-vif-unplugged-8029e455-c16d-48cd-93e1-cf56c226cc4a {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:18:51 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-f07a24e1-f14b-4f98-9e28-9ae32e18264d req-af49145e-b2b4-47c8-8636-f22d15664247 service nova] Acquiring lock "8aaa4e97-9439-4760-9e05-8b248b02074f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:18:51 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-f07a24e1-f14b-4f98-9e28-9ae32e18264d req-af49145e-b2b4-47c8-8636-f22d15664247 service nova] Lock "8aaa4e97-9439-4760-9e05-8b248b02074f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:18:51 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-f07a24e1-f14b-4f98-9e28-9ae32e18264d req-af49145e-b2b4-47c8-8636-f22d15664247 service nova] Lock "8aaa4e97-9439-4760-9e05-8b248b02074f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:18:51 user nova-compute[70975]: DEBUG nova.compute.manager [req-f07a24e1-f14b-4f98-9e28-9ae32e18264d req-af49145e-b2b4-47c8-8636-f22d15664247 service nova] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] No waiting events found dispatching network-vif-unplugged-8029e455-c16d-48cd-93e1-cf56c226cc4a {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:18:51 user nova-compute[70975]: DEBUG nova.compute.manager [req-f07a24e1-f14b-4f98-9e28-9ae32e18264d req-af49145e-b2b4-47c8-8636-f22d15664247 service nova] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Received event network-vif-unplugged-8029e455-c16d-48cd-93e1-cf56c226cc4a for instance with task_state deleting. 
{{(pid=70975) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 18 16:18:51 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Instance destroyed successfully. Apr 18 16:18:51 user nova-compute[70975]: DEBUG nova.objects.instance [None req-42975aef-8463-40b6-9b5e-5a79d07262bb tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Lazy-loading 'resources' on Instance uuid 8aaa4e97-9439-4760-9e05-8b248b02074f {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:18:51 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-42975aef-8463-40b6-9b5e-5a79d07262bb tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:15:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1796838032',display_name='tempest-ServerStableDeviceRescueTest-server-1796838032',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverstabledevicerescuetest-server-1796838032',id=9,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMRdR0cjFHm3mhHSll5gh7yZMFO8YnbHGZrzqn4BUKzi/NqN6epqJPxISmge123Mh6ultuf3msUKM4SPDGPvR5esoWMysquk2JzsFDlVx2V3n3YOLa1rlzu338dq4Z9bHg==',key_name='tempest-keypair-1985020567',keypairs=,launch_index=0,launched_at=2023-04-18T16:15:11Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='e6fc24a9e1b646a2a08df4f53f712267',ramdisk_id='',reservation_id='r-87uejcsr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerStableDeviceRescueTest-1233154848',owner_user_name='tempest-ServerStableDeviceRescueTest-1233154848-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-18T16:17:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='6a284b1ad50e463894f8d58d38a57d7c',uuid=8aaa4e97-9439-4760-9e05-8b248b02074f,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8029e455-c16d-48cd-93e1-cf56c226cc4a", "address": "fa:16:3e:38:a4:82", "network": {"id": 
"7692c2b5-931d-4d1d-aae6-384ce4ff5ff0", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-144924554-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.121", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e6fc24a9e1b646a2a08df4f53f712267", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8029e455-c1", "ovs_interfaceid": "8029e455-c16d-48cd-93e1-cf56c226cc4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 18 16:18:51 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-42975aef-8463-40b6-9b5e-5a79d07262bb tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Converting VIF {"id": "8029e455-c16d-48cd-93e1-cf56c226cc4a", "address": "fa:16:3e:38:a4:82", "network": {"id": "7692c2b5-931d-4d1d-aae6-384ce4ff5ff0", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-144924554-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.121", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e6fc24a9e1b646a2a08df4f53f712267", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8029e455-c1", "ovs_interfaceid": "8029e455-c16d-48cd-93e1-cf56c226cc4a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:18:51 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-42975aef-8463-40b6-9b5e-5a79d07262bb tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:38:a4:82,bridge_name='br-int',has_traffic_filtering=True,id=8029e455-c16d-48cd-93e1-cf56c226cc4a,network=Network(7692c2b5-931d-4d1d-aae6-384ce4ff5ff0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8029e455-c1') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:18:51 user nova-compute[70975]: DEBUG os_vif [None req-42975aef-8463-40b6-9b5e-5a79d07262bb tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Unplugging vif 
VIFOpenVSwitch(active=True,address=fa:16:3e:38:a4:82,bridge_name='br-int',has_traffic_filtering=True,id=8029e455-c16d-48cd-93e1-cf56c226cc4a,network=Network(7692c2b5-931d-4d1d-aae6-384ce4ff5ff0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8029e455-c1') {{(pid=70975) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 18 16:18:51 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:51 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8029e455-c1, bridge=br-int, if_exists=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:18:51 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:51 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:18:51 user nova-compute[70975]: INFO os_vif [None req-42975aef-8463-40b6-9b5e-5a79d07262bb tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:38:a4:82,bridge_name='br-int',has_traffic_filtering=True,id=8029e455-c16d-48cd-93e1-cf56c226cc4a,network=Network(7692c2b5-931d-4d1d-aae6-384ce4ff5ff0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8029e455-c1') Apr 18 16:18:51 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-42975aef-8463-40b6-9b5e-5a79d07262bb tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Deleting instance files /opt/stack/data/nova/instances/8aaa4e97-9439-4760-9e05-8b248b02074f_del Apr 18 16:18:51 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-42975aef-8463-40b6-9b5e-5a79d07262bb tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Deletion of /opt/stack/data/nova/instances/8aaa4e97-9439-4760-9e05-8b248b02074f_del complete Apr 18 16:18:51 user nova-compute[70975]: INFO nova.compute.manager [None req-42975aef-8463-40b6-9b5e-5a79d07262bb tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Took 0.64 seconds to destroy the instance on the hypervisor. Apr 18 16:18:51 user nova-compute[70975]: DEBUG oslo.service.loopingcall [None req-42975aef-8463-40b6-9b5e-5a79d07262bb tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=70975) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 18 16:18:51 user nova-compute[70975]: DEBUG nova.compute.manager [-] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Deallocating network for instance {{(pid=70975) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 18 16:18:51 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] deallocate_for_instance() {{(pid=70975) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 18 16:18:51 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Updating instance_info_cache with network_info: [] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:18:51 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Took 0.64 seconds to deallocate network for instance. Apr 18 16:18:51 user nova-compute[70975]: DEBUG nova.compute.manager [req-e1db0a14-d34f-4a04-a1b6-b3726470131c req-7092d2c5-ed98-4274-9f1f-b75649834842 service nova] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Received event network-vif-deleted-8029e455-c16d-48cd-93e1-cf56c226cc4a {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:18:51 user nova-compute[70975]: INFO nova.compute.manager [req-e1db0a14-d34f-4a04-a1b6-b3726470131c req-7092d2c5-ed98-4274-9f1f-b75649834842 service nova] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Neutron deleted interface 8029e455-c16d-48cd-93e1-cf56c226cc4a; detaching it from the instance and deleting it from the info cache Apr 18 16:18:51 user nova-compute[70975]: DEBUG nova.network.neutron [req-e1db0a14-d34f-4a04-a1b6-b3726470131c req-7092d2c5-ed98-4274-9f1f-b75649834842 service nova] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Updating instance_info_cache with network_info: [] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:18:51 user nova-compute[70975]: DEBUG nova.compute.manager [req-e1db0a14-d34f-4a04-a1b6-b3726470131c req-7092d2c5-ed98-4274-9f1f-b75649834842 service nova] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Detach interface failed, port_id=8029e455-c16d-48cd-93e1-cf56c226cc4a, reason: Instance 8aaa4e97-9439-4760-9e05-8b248b02074f could not be found. 
{{(pid=70975) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 18 16:18:51 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-42975aef-8463-40b6-9b5e-5a79d07262bb tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:18:51 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-42975aef-8463-40b6-9b5e-5a79d07262bb tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:18:52 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-42975aef-8463-40b6-9b5e-5a79d07262bb tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:18:52 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-42975aef-8463-40b6-9b5e-5a79d07262bb tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:18:52 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-42975aef-8463-40b6-9b5e-5a79d07262bb tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.257s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:18:52 user nova-compute[70975]: INFO nova.scheduler.client.report [None req-42975aef-8463-40b6-9b5e-5a79d07262bb tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Deleted allocations for instance 8aaa4e97-9439-4760-9e05-8b248b02074f Apr 18 16:18:52 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-42975aef-8463-40b6-9b5e-5a79d07262bb tempest-ServerStableDeviceRescueTest-1233154848 tempest-ServerStableDeviceRescueTest-1233154848-project-member] Lock "8aaa4e97-9439-4760-9e05-8b248b02074f" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.722s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:18:53 user nova-compute[70975]: DEBUG nova.compute.manager [req-03fe4dd8-7e9c-47b9-8cb4-af48fd89ba3e req-fa3cd727-5e60-4708-a7aa-ee27e4adccaf service 
nova] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Received event network-vif-plugged-8029e455-c16d-48cd-93e1-cf56c226cc4a {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:18:53 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-03fe4dd8-7e9c-47b9-8cb4-af48fd89ba3e req-fa3cd727-5e60-4708-a7aa-ee27e4adccaf service nova] Acquiring lock "8aaa4e97-9439-4760-9e05-8b248b02074f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:18:53 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-03fe4dd8-7e9c-47b9-8cb4-af48fd89ba3e req-fa3cd727-5e60-4708-a7aa-ee27e4adccaf service nova] Lock "8aaa4e97-9439-4760-9e05-8b248b02074f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:18:53 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-03fe4dd8-7e9c-47b9-8cb4-af48fd89ba3e req-fa3cd727-5e60-4708-a7aa-ee27e4adccaf service nova] Lock "8aaa4e97-9439-4760-9e05-8b248b02074f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:18:53 user nova-compute[70975]: DEBUG nova.compute.manager [req-03fe4dd8-7e9c-47b9-8cb4-af48fd89ba3e req-fa3cd727-5e60-4708-a7aa-ee27e4adccaf service nova] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] No waiting events found dispatching network-vif-plugged-8029e455-c16d-48cd-93e1-cf56c226cc4a {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:18:53 user nova-compute[70975]: WARNING nova.compute.manager [req-03fe4dd8-7e9c-47b9-8cb4-af48fd89ba3e req-fa3cd727-5e60-4708-a7aa-ee27e4adccaf service nova] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Received unexpected event network-vif-plugged-8029e455-c16d-48cd-93e1-cf56c226cc4a for instance with vm_state deleted and task_state None. 
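Annotation: the block above is one complete teardown of instance 8aaa4e97-9439-4760-9e05-8b248b02074f. terminate_instance serializes on the instance UUID lock, the libvirt driver destroys the guest, os-vif unplugs the OVS VIF (which removes tap8029e455-c1 from br-int via the ovsdbapp DelPortCommand), the `_del` instance directory is removed, the network is deallocated, and the Placement allocations are deleted; the final network-vif-plugged WARNING is again a late Neutron event for an already-deleted instance. The sketch below follows os-vif's documented plug/unplug usage for that unplug step. The field values are copied from the log entries above, but treat it as a sketch of the library call only: it assumes a host with Open vSwitch, a reachable OVSDB, and the os-vif 'ovs' plugin installed, not something to run against a production bridge.

```python
# Sketch of the os-vif unplug step logged above, using os-vif's public
# plug/unplug API. IDs and names are taken from the log; running this for
# real requires Open vSwitch plus the os-vif 'ovs' plugin on the host.
import os_vif
from os_vif import objects

os_vif.initialize()  # loads the linux_bridge/noop/ovs VIF plugins

network = objects.network.Network(
    id='7692c2b5-931d-4d1d-aae6-384ce4ff5ff0', bridge='br-int')
vif = objects.vif.VIFOpenVSwitch(
    id='8029e455-c16d-48cd-93e1-cf56c226cc4a',
    address='fa:16:3e:38:a4:82',
    vif_name='tap8029e455-c1',
    bridge_name='br-int',
    plugin='ovs',
    port_profile=objects.vif.VIFPortProfileOpenVSwitch(
        interface_id='8029e455-c16d-48cd-93e1-cf56c226cc4a'),
    network=network)
instance = objects.instance_info.InstanceInfo(
    uuid='8aaa4e97-9439-4760-9e05-8b248b02074f',
    name='tempest-ServerStableDeviceRescueTest-server-1796838032')

# Removes the tap port from br-int -- the DelPortCommand transaction
# visible in the ovsdbapp debug lines above.
os_vif.unplug(vif, instance)
```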
Apr 18 16:18:55 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-a4458037-ace2-4fff-bcc5-837055782440 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Acquiring lock "d7a293bf-a9bd-424e-ba11-bbed7dfea41c" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:18:55 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-a4458037-ace2-4fff-bcc5-837055782440 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lock "d7a293bf-a9bd-424e-ba11-bbed7dfea41c" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:18:55 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-a4458037-ace2-4fff-bcc5-837055782440 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Acquiring lock "d7a293bf-a9bd-424e-ba11-bbed7dfea41c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:18:55 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-a4458037-ace2-4fff-bcc5-837055782440 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lock "d7a293bf-a9bd-424e-ba11-bbed7dfea41c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:18:55 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-a4458037-ace2-4fff-bcc5-837055782440 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lock "d7a293bf-a9bd-424e-ba11-bbed7dfea41c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:18:55 user nova-compute[70975]: INFO nova.compute.manager [None req-a4458037-ace2-4fff-bcc5-837055782440 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Terminating instance Apr 18 16:18:55 user nova-compute[70975]: DEBUG nova.compute.manager [None req-a4458037-ace2-4fff-bcc5-837055782440 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Start destroying the instance on the hypervisor. 
{{(pid=70975) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 18 16:18:55 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:55 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:55 user nova-compute[70975]: DEBUG nova.compute.manager [req-1dd30b72-57f0-493d-9ff7-4a43d2ec8201 req-5f22bb7a-5526-4d68-bbec-1203d6da5778 service nova] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Received event network-vif-unplugged-e5d69d5c-1a5c-4300-ab15-e73f78388f0e {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:18:55 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-1dd30b72-57f0-493d-9ff7-4a43d2ec8201 req-5f22bb7a-5526-4d68-bbec-1203d6da5778 service nova] Acquiring lock "d7a293bf-a9bd-424e-ba11-bbed7dfea41c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:18:55 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-1dd30b72-57f0-493d-9ff7-4a43d2ec8201 req-5f22bb7a-5526-4d68-bbec-1203d6da5778 service nova] Lock "d7a293bf-a9bd-424e-ba11-bbed7dfea41c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:18:55 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-1dd30b72-57f0-493d-9ff7-4a43d2ec8201 req-5f22bb7a-5526-4d68-bbec-1203d6da5778 service nova] Lock "d7a293bf-a9bd-424e-ba11-bbed7dfea41c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:18:55 user nova-compute[70975]: DEBUG nova.compute.manager [req-1dd30b72-57f0-493d-9ff7-4a43d2ec8201 req-5f22bb7a-5526-4d68-bbec-1203d6da5778 service nova] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] No waiting events found dispatching network-vif-unplugged-e5d69d5c-1a5c-4300-ab15-e73f78388f0e {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:18:55 user nova-compute[70975]: DEBUG nova.compute.manager [req-1dd30b72-57f0-493d-9ff7-4a43d2ec8201 req-5f22bb7a-5526-4d68-bbec-1203d6da5778 service nova] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Received event network-vif-unplugged-e5d69d5c-1a5c-4300-ab15-e73f78388f0e for instance with task_state deleting. 
{{(pid=70975) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 18 16:18:55 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:55 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:55 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:55 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:55 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Instance destroyed successfully. Apr 18 16:18:55 user nova-compute[70975]: DEBUG nova.objects.instance [None req-a4458037-ace2-4fff-bcc5-837055782440 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lazy-loading 'resources' on Instance uuid d7a293bf-a9bd-424e-ba11-bbed7dfea41c {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:18:55 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-a4458037-ace2-4fff-bcc5-837055782440 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:14:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1351031695',display_name='tempest-ServerRescueNegativeTestJSON-server-1351031695',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-1351031695',id=4,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-18T16:16:06Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='261e8ba82d9e4203917afb0241a3b4fc',ramdisk_id='',reservation_id='r-aw8jyd7h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerRescueNegativeTestJSON-1586888284',owner_user_name='tempest-ServerRes
cueNegativeTestJSON-1586888284-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-18T16:16:07Z,user_data=None,user_id='a8a3f45f9c6c431781fb582b8da22b0b',uuid=d7a293bf-a9bd-424e-ba11-bbed7dfea41c,vcpu_model=,vcpus=1,vm_mode=None,vm_state='rescued') vif={"id": "e5d69d5c-1a5c-4300-ab15-e73f78388f0e", "address": "fa:16:3e:92:2d:7f", "network": {"id": "1bff5cf0-81ae-4bf0-88ab-fd844ee78e3f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1814061150-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "261e8ba82d9e4203917afb0241a3b4fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape5d69d5c-1a", "ovs_interfaceid": "e5d69d5c-1a5c-4300-ab15-e73f78388f0e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 18 16:18:55 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-a4458037-ace2-4fff-bcc5-837055782440 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Converting VIF {"id": "e5d69d5c-1a5c-4300-ab15-e73f78388f0e", "address": "fa:16:3e:92:2d:7f", "network": {"id": "1bff5cf0-81ae-4bf0-88ab-fd844ee78e3f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1814061150-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "261e8ba82d9e4203917afb0241a3b4fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape5d69d5c-1a", "ovs_interfaceid": "e5d69d5c-1a5c-4300-ab15-e73f78388f0e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:18:55 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-a4458037-ace2-4fff-bcc5-837055782440 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:92:2d:7f,bridge_name='br-int',has_traffic_filtering=True,id=e5d69d5c-1a5c-4300-ab15-e73f78388f0e,network=Network(1bff5cf0-81ae-4bf0-88ab-fd844ee78e3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5d69d5c-1a') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:18:55 user nova-compute[70975]: DEBUG os_vif [None req-a4458037-ace2-4fff-bcc5-837055782440 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] 
Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:92:2d:7f,bridge_name='br-int',has_traffic_filtering=True,id=e5d69d5c-1a5c-4300-ab15-e73f78388f0e,network=Network(1bff5cf0-81ae-4bf0-88ab-fd844ee78e3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5d69d5c-1a') {{(pid=70975) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 18 16:18:55 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:55 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape5d69d5c-1a, bridge=br-int, if_exists=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:18:55 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:18:55 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:18:55 user nova-compute[70975]: INFO os_vif [None req-a4458037-ace2-4fff-bcc5-837055782440 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:92:2d:7f,bridge_name='br-int',has_traffic_filtering=True,id=e5d69d5c-1a5c-4300-ab15-e73f78388f0e,network=Network(1bff5cf0-81ae-4bf0-88ab-fd844ee78e3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5d69d5c-1a') Apr 18 16:18:55 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-a4458037-ace2-4fff-bcc5-837055782440 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Deleting instance files /opt/stack/data/nova/instances/d7a293bf-a9bd-424e-ba11-bbed7dfea41c_del Apr 18 16:18:55 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-a4458037-ace2-4fff-bcc5-837055782440 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Deletion of /opt/stack/data/nova/instances/d7a293bf-a9bd-424e-ba11-bbed7dfea41c_del complete Apr 18 16:18:55 user nova-compute[70975]: INFO nova.compute.manager [None req-a4458037-ace2-4fff-bcc5-837055782440 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Took 0.74 seconds to destroy the instance on the hypervisor. Apr 18 16:18:55 user nova-compute[70975]: DEBUG oslo.service.loopingcall [None req-a4458037-ace2-4fff-bcc5-837055782440 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=70975) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 18 16:18:55 user nova-compute[70975]: DEBUG nova.compute.manager [-] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Deallocating network for instance {{(pid=70975) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 18 16:18:55 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] deallocate_for_instance() {{(pid=70975) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 18 16:18:56 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Updating instance_info_cache with network_info: [] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:18:56 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Took 0.52 seconds to deallocate network for instance. Apr 18 16:18:56 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-a4458037-ace2-4fff-bcc5-837055782440 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:18:56 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-a4458037-ace2-4fff-bcc5-837055782440 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:18:56 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-a4458037-ace2-4fff-bcc5-837055782440 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:18:56 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-a4458037-ace2-4fff-bcc5-837055782440 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:18:56 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-a4458037-ace2-4fff-bcc5-837055782440 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.249s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:18:56 user 
nova-compute[70975]: INFO nova.scheduler.client.report [None req-a4458037-ace2-4fff-bcc5-837055782440 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Deleted allocations for instance d7a293bf-a9bd-424e-ba11-bbed7dfea41c Apr 18 16:18:56 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-a4458037-ace2-4fff-bcc5-837055782440 tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lock "d7a293bf-a9bd-424e-ba11-bbed7dfea41c" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.692s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:18:57 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:18:57 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:18:57 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=70975) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 18 16:18:57 user nova-compute[70975]: DEBUG nova.compute.manager [req-3c84ce04-9dc2-412c-9511-b5297a2cc038 req-9bd755be-c249-4bda-b244-23b52d24db09 service nova] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Received event network-vif-plugged-e5d69d5c-1a5c-4300-ab15-e73f78388f0e {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:18:57 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-3c84ce04-9dc2-412c-9511-b5297a2cc038 req-9bd755be-c249-4bda-b244-23b52d24db09 service nova] Acquiring lock "d7a293bf-a9bd-424e-ba11-bbed7dfea41c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:18:57 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-3c84ce04-9dc2-412c-9511-b5297a2cc038 req-9bd755be-c249-4bda-b244-23b52d24db09 service nova] Lock "d7a293bf-a9bd-424e-ba11-bbed7dfea41c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:18:57 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-3c84ce04-9dc2-412c-9511-b5297a2cc038 req-9bd755be-c249-4bda-b244-23b52d24db09 service nova] Lock "d7a293bf-a9bd-424e-ba11-bbed7dfea41c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:18:57 user nova-compute[70975]: DEBUG nova.compute.manager [req-3c84ce04-9dc2-412c-9511-b5297a2cc038 req-9bd755be-c249-4bda-b244-23b52d24db09 service nova] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] No waiting events found dispatching network-vif-plugged-e5d69d5c-1a5c-4300-ab15-e73f78388f0e 
{{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:18:57 user nova-compute[70975]: WARNING nova.compute.manager [req-3c84ce04-9dc2-412c-9511-b5297a2cc038 req-9bd755be-c249-4bda-b244-23b52d24db09 service nova] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Received unexpected event network-vif-plugged-e5d69d5c-1a5c-4300-ab15-e73f78388f0e for instance with vm_state deleted and task_state None. Apr 18 16:18:57 user nova-compute[70975]: DEBUG nova.compute.manager [req-3c84ce04-9dc2-412c-9511-b5297a2cc038 req-9bd755be-c249-4bda-b244-23b52d24db09 service nova] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Received event network-vif-deleted-e5d69d5c-1a5c-4300-ab15-e73f78388f0e {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:18:58 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:19:00 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:19:00 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:19:00 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager.update_available_resource {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:19:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:19:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:19:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:19:00 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Auditing locally available compute resources for user (node: user) {{(pid=70975) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 18 16:19:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m 
oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:19:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:19:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:19:00 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:19:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b71bd3c1-da58-4cb0-abc3-650e11b9d4ce/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:19:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b71bd3c1-da58-4cb0-abc3-650e11b9d4ce/disk --force-share --output=json" returned: 0 in 0.149s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:19:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b71bd3c1-da58-4cb0-abc3-650e11b9d4ce/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:19:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env 
LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b71bd3c1-da58-4cb0-abc3-650e11b9d4ce/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:19:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/da82d905-1ca1-403d-9598-7561e69b9704/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:19:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/da82d905-1ca1-403d-9598-7561e69b9704/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:19:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/da82d905-1ca1-403d-9598-7561e69b9704/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:19:01 user nova-compute[70975]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:19:01 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: e32196da-a530-4422-8566-5edb01f3cc62] VM Stopped (Lifecycle Event) Apr 18 16:19:01 user nova-compute[70975]: DEBUG nova.compute.manager [None req-85584548-efc1-4a28-ad1c-fd601bed70ce None None] [instance: e32196da-a530-4422-8566-5edb01f3cc62] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:19:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/da82d905-1ca1-403d-9598-7561e69b9704/disk --force-share --output=json" returned: 0 in 0.128s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:19:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:19:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/disk --force-share 
--output=json" returned: 0 in 0.133s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:19:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:19:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:19:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6c592508-0444-4b42-a0b5-e3d8bd97f5ba/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:19:01 user nova-compute[70975]: DEBUG nova.compute.manager [req-baed56d4-a49d-45d6-904f-a367d5e9e0e3 req-b63bd555-33c4-44a0-8a06-fe8bf42d7b8d service nova] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Received event network-changed-bfcdfd2e-b438-4386-bcae-7088ec17c0e6 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:19:01 user nova-compute[70975]: DEBUG nova.compute.manager [req-baed56d4-a49d-45d6-904f-a367d5e9e0e3 req-b63bd555-33c4-44a0-8a06-fe8bf42d7b8d service nova] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Refreshing instance network info cache due to event network-changed-bfcdfd2e-b438-4386-bcae-7088ec17c0e6. 
{{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 18 16:19:01 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-baed56d4-a49d-45d6-904f-a367d5e9e0e3 req-b63bd555-33c4-44a0-8a06-fe8bf42d7b8d service nova] Acquiring lock "refresh_cache-f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:19:01 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-baed56d4-a49d-45d6-904f-a367d5e9e0e3 req-b63bd555-33c4-44a0-8a06-fe8bf42d7b8d service nova] Acquired lock "refresh_cache-f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:19:01 user nova-compute[70975]: DEBUG nova.network.neutron [req-baed56d4-a49d-45d6-904f-a367d5e9e0e3 req-b63bd555-33c4-44a0-8a06-fe8bf42d7b8d service nova] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Refreshing network info cache for port bfcdfd2e-b438-4386-bcae-7088ec17c0e6 {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 18 16:19:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6c592508-0444-4b42-a0b5-e3d8bd97f5ba/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:19:01 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6c592508-0444-4b42-a0b5-e3d8bd97f5ba/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:19:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6c592508-0444-4b42-a0b5-e3d8bd97f5ba/disk --force-share --output=json" returned: 0 in 0.149s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:19:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:19:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:19:02 user nova-compute[70975]: DEBUG 
oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:19:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:19:02 user nova-compute[70975]: DEBUG nova.network.neutron [req-baed56d4-a49d-45d6-904f-a367d5e9e0e3 req-b63bd555-33c4-44a0-8a06-fe8bf42d7b8d service nova] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Updated VIF entry in instance network info cache for port bfcdfd2e-b438-4386-bcae-7088ec17c0e6. {{(pid=70975) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 18 16:19:02 user nova-compute[70975]: DEBUG nova.network.neutron [req-baed56d4-a49d-45d6-904f-a367d5e9e0e3 req-b63bd555-33c4-44a0-8a06-fe8bf42d7b8d service nova] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Updating instance_info_cache with network_info: [{"id": "bfcdfd2e-b438-4386-bcae-7088ec17c0e6", "address": "fa:16:3e:9a:28:d5", "network": {"id": "02aca424-2923-404b-9c66-76bec89f82b7", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1255258227-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.95", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6b4e8d8797be4c0e91b1401538f2eba8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfcdfd2e-b4", "ovs_interfaceid": "bfcdfd2e-b438-4386-bcae-7088ec17c0e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:19:02 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:02 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:19:02 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
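A minimal sketch of the invocation pattern recorded in the entries above, assuming oslo.concurrency and qemu-img are available on the host; the disk path is a placeholder and the snippet is illustrative rather than part of the captured log. The resource caps match the logged command line (--as=1073741824, --cpu=30):

import json

from oslo_concurrency import processutils

# Resource caps equivalent to the "--as=1073741824 --cpu=30" prlimit wrapper seen in the log.
QEMU_IMG_LIMITS = processutils.ProcessLimits(
    cpu_time=30,               # seconds of CPU time allowed for qemu-img
    address_space=1073741824,  # 1 GiB of virtual address space
)

def qemu_img_info(disk_path):
    # Produces a command of the form logged above:
    #   python -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 --
    #     env LC_ALL=C LANG=C qemu-img info <disk_path> --force-share --output=json
    out, _err = processutils.execute(
        'env', 'LC_ALL=C', 'LANG=C',
        'qemu-img', 'info', disk_path,
        '--force-share', '--output=json',
        prlimit=QEMU_IMG_LIMITS)
    return json.loads(out)

# Placeholder path; real instance disks live under /opt/stack/data/nova/instances/<uuid>/disk.
print(qemu_img_info('/opt/stack/data/nova/instances/example/disk'))

Capping the helper's CPU time and address space keeps a malformed or oversized image from stalling or exhausting the compute service while its metadata is read, and --force-share allows the query to run against a disk that a running guest still holds open.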
Apr 18 16:19:02 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Hypervisor/Node resource view: name=user free_ram=8380MB free_disk=26.60694122314453GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", 
"numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70975) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 18 16:19:02 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:19:02 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:19:02 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-baed56d4-a49d-45d6-904f-a367d5e9e0e3 req-b63bd555-33c4-44a0-8a06-fe8bf42d7b8d service nova] Releasing lock "refresh_cache-f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:19:02 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance da82d905-1ca1-403d-9598-7561e69b9704 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:19:02 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 1b530349-680e-4def-86ef-29c340efa175 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:19:02 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 6c592508-0444-4b42-a0b5-e3d8bd97f5ba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:19:02 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:19:02 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance b71bd3c1-da58-4cb0-abc3-650e11b9d4ce actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:19:02 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 6528f05a-9f05-4f35-b991-687e4f47029e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:19:02 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Total usable vcpus: 12, total allocated vcpus: 6 {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 18 16:19:02 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Final resource view: name=user phys_ram=16023MB used_ram=1280MB phys_disk=40GB used_disk=6GB total_vcpus=12 used_vcpus=6 pci_stats=[] {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 18 16:19:03 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:19:03 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:19:03 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Compute_service record updated for user:user {{(pid=70975) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 18 16:19:03 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.315s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:19:03 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-7958912e-1c48-4205-a4ee-99c51e548532 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Acquiring lock "f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc" by 
"nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:19:03 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-7958912e-1c48-4205-a4ee-99c51e548532 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:19:03 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-7958912e-1c48-4205-a4ee-99c51e548532 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Acquiring lock "f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:19:03 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-7958912e-1c48-4205-a4ee-99c51e548532 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:19:03 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-7958912e-1c48-4205-a4ee-99c51e548532 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:19:03 user nova-compute[70975]: INFO nova.compute.manager [None req-7958912e-1c48-4205-a4ee-99c51e548532 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Terminating instance Apr 18 16:19:03 user nova-compute[70975]: DEBUG nova.compute.manager [None req-7958912e-1c48-4205-a4ee-99c51e548532 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Start destroying the instance on the hypervisor. 
{{(pid=70975) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 18 16:19:03 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:03 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:03 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:03 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:03 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:03 user nova-compute[70975]: DEBUG nova.compute.manager [req-55839c22-a816-4f92-bd66-8a8b09bf8516 req-3d50e2c0-1969-4f8a-a9cc-4301858f3f60 service nova] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Received event network-vif-unplugged-bfcdfd2e-b438-4386-bcae-7088ec17c0e6 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:19:03 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-55839c22-a816-4f92-bd66-8a8b09bf8516 req-3d50e2c0-1969-4f8a-a9cc-4301858f3f60 service nova] Acquiring lock "f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:19:03 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-55839c22-a816-4f92-bd66-8a8b09bf8516 req-3d50e2c0-1969-4f8a-a9cc-4301858f3f60 service nova] Lock "f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:19:03 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-55839c22-a816-4f92-bd66-8a8b09bf8516 req-3d50e2c0-1969-4f8a-a9cc-4301858f3f60 service nova] Lock "f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:19:03 user nova-compute[70975]: DEBUG nova.compute.manager [req-55839c22-a816-4f92-bd66-8a8b09bf8516 req-3d50e2c0-1969-4f8a-a9cc-4301858f3f60 service nova] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] No waiting events found dispatching network-vif-unplugged-bfcdfd2e-b438-4386-bcae-7088ec17c0e6 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:19:03 user nova-compute[70975]: DEBUG nova.compute.manager [req-55839c22-a816-4f92-bd66-8a8b09bf8516 req-3d50e2c0-1969-4f8a-a9cc-4301858f3f60 service nova] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Received event network-vif-unplugged-bfcdfd2e-b438-4386-bcae-7088ec17c0e6 for instance with task_state deleting. 
{{(pid=70975) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 18 16:19:03 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Instance destroyed successfully. Apr 18 16:19:03 user nova-compute[70975]: DEBUG nova.objects.instance [None req-7958912e-1c48-4205-a4ee-99c51e548532 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lazy-loading 'resources' on Instance uuid f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:19:03 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-7958912e-1c48-4205-a4ee-99c51e548532 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:17:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-14343453',display_name='tempest-AttachVolumeNegativeTest-server-14343453',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-14343453',id=12,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI2SDBHv75l7hW3tiq5hWHFRDYyei1QQIo9CQRQFQISK8RVXUcgtsJBeI8pkGbxlcETA/pFpFNDAjbdgyUlN3UoIYqsksl/hRT8/J7etZF7prNIypo7A3UV/2lzY82gGhg==',key_name='tempest-keypair-1742241088',keypairs=,launch_index=0,launched_at=2023-04-18T16:17:16Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='6b4e8d8797be4c0e91b1401538f2eba8',ramdisk_id='',reservation_id='r-rlj93r3i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeNegativeTest-216357456',owner_user_name='tempest-AttachVolumeNegativeTest-216357456-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-18T16:17:16Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='af90e17ec027463fa8793e8539c39e13',uuid=f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bfcdfd2e-b438-4386-bcae-7088ec17c0e6", "address": "fa:16:3e:9a:28:d5", "network": {"id": "02aca424-2923-404b-9c66-76bec89f82b7", 
"bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1255258227-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.95", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6b4e8d8797be4c0e91b1401538f2eba8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfcdfd2e-b4", "ovs_interfaceid": "bfcdfd2e-b438-4386-bcae-7088ec17c0e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 18 16:19:03 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-7958912e-1c48-4205-a4ee-99c51e548532 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Converting VIF {"id": "bfcdfd2e-b438-4386-bcae-7088ec17c0e6", "address": "fa:16:3e:9a:28:d5", "network": {"id": "02aca424-2923-404b-9c66-76bec89f82b7", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1255258227-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.95", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6b4e8d8797be4c0e91b1401538f2eba8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfcdfd2e-b4", "ovs_interfaceid": "bfcdfd2e-b438-4386-bcae-7088ec17c0e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:19:03 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-7958912e-1c48-4205-a4ee-99c51e548532 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9a:28:d5,bridge_name='br-int',has_traffic_filtering=True,id=bfcdfd2e-b438-4386-bcae-7088ec17c0e6,network=Network(02aca424-2923-404b-9c66-76bec89f82b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfcdfd2e-b4') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:19:03 user nova-compute[70975]: DEBUG os_vif [None req-7958912e-1c48-4205-a4ee-99c51e548532 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9a:28:d5,bridge_name='br-int',has_traffic_filtering=True,id=bfcdfd2e-b438-4386-bcae-7088ec17c0e6,network=Network(02aca424-2923-404b-9c66-76bec89f82b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfcdfd2e-b4') {{(pid=70975) unplug 
/usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 18 16:19:03 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:03 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbfcdfd2e-b4, bridge=br-int, if_exists=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:19:03 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:03 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:19:03 user nova-compute[70975]: INFO os_vif [None req-7958912e-1c48-4205-a4ee-99c51e548532 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9a:28:d5,bridge_name='br-int',has_traffic_filtering=True,id=bfcdfd2e-b438-4386-bcae-7088ec17c0e6,network=Network(02aca424-2923-404b-9c66-76bec89f82b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfcdfd2e-b4') Apr 18 16:19:03 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-7958912e-1c48-4205-a4ee-99c51e548532 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Deleting instance files /opt/stack/data/nova/instances/f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc_del Apr 18 16:19:03 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-7958912e-1c48-4205-a4ee-99c51e548532 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Deletion of /opt/stack/data/nova/instances/f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc_del complete Apr 18 16:19:03 user nova-compute[70975]: INFO nova.compute.manager [None req-7958912e-1c48-4205-a4ee-99c51e548532 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Took 0.64 seconds to destroy the instance on the hypervisor. Apr 18 16:19:03 user nova-compute[70975]: DEBUG oslo.service.loopingcall [None req-7958912e-1c48-4205-a4ee-99c51e548532 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=70975) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 18 16:19:03 user nova-compute[70975]: DEBUG nova.compute.manager [-] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Deallocating network for instance {{(pid=70975) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 18 16:19:03 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] deallocate_for_instance() {{(pid=70975) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 18 16:19:04 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:19:04 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:19:04 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Starting heal instance info cache {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 18 16:19:04 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "refresh_cache-6c592508-0444-4b42-a0b5-e3d8bd97f5ba" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:19:04 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquired lock "refresh_cache-6c592508-0444-4b42-a0b5-e3d8bd97f5ba" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:19:04 user nova-compute[70975]: DEBUG nova.network.neutron [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Forcefully refreshing network info cache for instance {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 18 16:19:04 user nova-compute[70975]: DEBUG nova.network.neutron [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Updating instance_info_cache with network_info: [{"id": "395afd81-e898-47ee-a928-eaab584d5b4e", "address": "fa:16:3e:fa:1c:ad", "network": {"id": "0fafa4bf-5d8d-43f9-8cc3-ae6e7c58812c", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-891115046-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.120", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8edf93a24e754e1ea58c0a7fd4f553dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap395afd81-e8", "ovs_interfaceid": "395afd81-e898-47ee-a928-eaab584d5b4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": 
{}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:19:04 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Releasing lock "refresh_cache-6c592508-0444-4b42-a0b5-e3d8bd97f5ba" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:19:04 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Updated the network info_cache for instance {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 18 16:19:04 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:19:04 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Updating instance_info_cache with network_info: [] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:19:04 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Took 0.91 seconds to deallocate network for instance. Apr 18 16:19:04 user nova-compute[70975]: DEBUG nova.compute.manager [req-6fd2eaa9-c437-4d1e-90cd-46a7dd067f74 req-67faecec-faaf-4c6b-bf34-bcc89f2da162 service nova] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Received event network-vif-deleted-bfcdfd2e-b438-4386-bcae-7088ec17c0e6 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:19:04 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-7958912e-1c48-4205-a4ee-99c51e548532 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:19:04 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-7958912e-1c48-4205-a4ee-99c51e548532 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.002s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:19:05 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-7958912e-1c48-4205-a4ee-99c51e548532 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:19:05 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-7958912e-1c48-4205-a4ee-99c51e548532 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 
1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:19:05 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-7958912e-1c48-4205-a4ee-99c51e548532 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.271s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:19:05 user nova-compute[70975]: INFO nova.scheduler.client.report [None req-7958912e-1c48-4205-a4ee-99c51e548532 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Deleted allocations for instance f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc Apr 18 16:19:05 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-7958912e-1c48-4205-a4ee-99c51e548532 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.997s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:19:05 user nova-compute[70975]: DEBUG nova.compute.manager [req-4b033ef2-9578-486a-80cc-edabb29ca4e7 req-fe7e8126-9f28-4760-8dd2-e7fdeb4464e2 service nova] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Received event network-vif-plugged-bfcdfd2e-b438-4386-bcae-7088ec17c0e6 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:19:05 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-4b033ef2-9578-486a-80cc-edabb29ca4e7 req-fe7e8126-9f28-4760-8dd2-e7fdeb4464e2 service nova] Acquiring lock "f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:19:05 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-4b033ef2-9578-486a-80cc-edabb29ca4e7 req-fe7e8126-9f28-4760-8dd2-e7fdeb4464e2 service nova] Lock "f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:19:05 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-4b033ef2-9578-486a-80cc-edabb29ca4e7 req-fe7e8126-9f28-4760-8dd2-e7fdeb4464e2 service nova] Lock "f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:19:05 user nova-compute[70975]: DEBUG nova.compute.manager [req-4b033ef2-9578-486a-80cc-edabb29ca4e7 req-fe7e8126-9f28-4760-8dd2-e7fdeb4464e2 service nova] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] No waiting events found dispatching network-vif-plugged-bfcdfd2e-b438-4386-bcae-7088ec17c0e6 {{(pid=70975) pop_instance_event 
/opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:19:05 user nova-compute[70975]: WARNING nova.compute.manager [req-4b033ef2-9578-486a-80cc-edabb29ca4e7 req-fe7e8126-9f28-4760-8dd2-e7fdeb4464e2 service nova] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Received unexpected event network-vif-plugged-bfcdfd2e-b438-4386-bcae-7088ec17c0e6 for instance with vm_state deleted and task_state None. Apr 18 16:19:06 user nova-compute[70975]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:19:06 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] VM Stopped (Lifecycle Event) Apr 18 16:19:06 user nova-compute[70975]: DEBUG nova.compute.manager [None req-3aeb78e7-f0aa-476d-bf2c-15878ea79792 None None] [instance: 8aaa4e97-9439-4760-9e05-8b248b02074f] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:19:07 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:08 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:10 user nova-compute[70975]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:19:10 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] VM Stopped (Lifecycle Event) Apr 18 16:19:10 user nova-compute[70975]: DEBUG nova.compute.manager [None req-dc9e41f8-dfde-49f6-83f7-15805d5e8fe7 None None] [instance: d7a293bf-a9bd-424e-ba11-bbed7dfea41c] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:19:12 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:13 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:18 user nova-compute[70975]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:19:18 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] VM Stopped (Lifecycle Event) Apr 18 16:19:18 user nova-compute[70975]: DEBUG nova.compute.manager [None req-a4531314-7b36-4171-86dc-489e4c569e26 None None] [instance: f39eb5ec-4b8d-4ff7-8b47-aa1c34cfa3cc] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:19:18 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:20 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-d3770084-766c-40ad-90d4-e8319ab3d1c8 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Acquiring lock "6c592508-0444-4b42-a0b5-e3d8bd97f5ba" by 
"nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:19:20 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-d3770084-766c-40ad-90d4-e8319ab3d1c8 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Lock "6c592508-0444-4b42-a0b5-e3d8bd97f5ba" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:19:20 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-d3770084-766c-40ad-90d4-e8319ab3d1c8 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Acquiring lock "6c592508-0444-4b42-a0b5-e3d8bd97f5ba-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:19:20 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-d3770084-766c-40ad-90d4-e8319ab3d1c8 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Lock "6c592508-0444-4b42-a0b5-e3d8bd97f5ba-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:19:20 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-d3770084-766c-40ad-90d4-e8319ab3d1c8 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Lock "6c592508-0444-4b42-a0b5-e3d8bd97f5ba-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:19:20 user nova-compute[70975]: INFO nova.compute.manager [None req-d3770084-766c-40ad-90d4-e8319ab3d1c8 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Terminating instance Apr 18 16:19:20 user nova-compute[70975]: DEBUG nova.compute.manager [None req-d3770084-766c-40ad-90d4-e8319ab3d1c8 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Start destroying the instance on the hypervisor. 
{{(pid=70975) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 18 16:19:20 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:20 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:20 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:20 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:20 user nova-compute[70975]: DEBUG nova.compute.manager [req-8db5cd04-5ad4-414c-a7a9-b72a9e4237bc req-c6dd5208-57c0-486b-a96a-e69ee08668bd service nova] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Received event network-vif-unplugged-395afd81-e898-47ee-a928-eaab584d5b4e {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:19:20 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-8db5cd04-5ad4-414c-a7a9-b72a9e4237bc req-c6dd5208-57c0-486b-a96a-e69ee08668bd service nova] Acquiring lock "6c592508-0444-4b42-a0b5-e3d8bd97f5ba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:19:20 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-8db5cd04-5ad4-414c-a7a9-b72a9e4237bc req-c6dd5208-57c0-486b-a96a-e69ee08668bd service nova] Lock "6c592508-0444-4b42-a0b5-e3d8bd97f5ba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:19:20 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-8db5cd04-5ad4-414c-a7a9-b72a9e4237bc req-c6dd5208-57c0-486b-a96a-e69ee08668bd service nova] Lock "6c592508-0444-4b42-a0b5-e3d8bd97f5ba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:19:20 user nova-compute[70975]: DEBUG nova.compute.manager [req-8db5cd04-5ad4-414c-a7a9-b72a9e4237bc req-c6dd5208-57c0-486b-a96a-e69ee08668bd service nova] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] No waiting events found dispatching network-vif-unplugged-395afd81-e898-47ee-a928-eaab584d5b4e {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:19:20 user nova-compute[70975]: DEBUG nova.compute.manager [req-8db5cd04-5ad4-414c-a7a9-b72a9e4237bc req-c6dd5208-57c0-486b-a96a-e69ee08668bd service nova] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Received event network-vif-unplugged-395afd81-e898-47ee-a928-eaab584d5b4e for instance with task_state deleting. 
{{(pid=70975) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 18 16:19:20 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:20 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Instance destroyed successfully. Apr 18 16:19:20 user nova-compute[70975]: DEBUG nova.objects.instance [None req-d3770084-766c-40ad-90d4-e8319ab3d1c8 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Lazy-loading 'resources' on Instance uuid 6c592508-0444-4b42-a0b5-e3d8bd97f5ba {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:19:20 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-d3770084-766c-40ad-90d4-e8319ab3d1c8 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:14:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-370003702',display_name='tempest-VolumesAdminNegativeTest-server-370003702',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-370003702',id=7,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAniXSetAQE9Sn51zA8NpTX2dOiul2qACE7wlThUOvDLY/XUKayPw9h+boGYtqxwA3BNtZbXaC0adc4Uojp5kUY4JmnKz7unbT3y9taLOI+qBOXnUno++8x4d6lIizphZQ==',key_name='tempest-keypair-1220171208',keypairs=,launch_index=0,launched_at=2023-04-18T16:14:54Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='8edf93a24e754e1ea58c0a7fd4f553dc',ramdisk_id='',reservation_id='r-cxt60s0r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-VolumesAdminNegativeTest-2015888259',owner_user_name='tempest-VolumesAdminNegativeTest-2015888259-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-18T16:14:55Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='299ba2e202244f59a09e22df9ea8efe7',uuid=6c592508-0444-4b42-a0b5-e3d8bd97f5ba,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "395afd81-e898-47ee-a928-eaab584d5b4e", "address": "fa:16:3e:fa:1c:ad", "network": {"id": "0fafa4bf-5d8d-43f9-8cc3-ae6e7c58812c", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-891115046-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.120", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8edf93a24e754e1ea58c0a7fd4f553dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap395afd81-e8", "ovs_interfaceid": "395afd81-e898-47ee-a928-eaab584d5b4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 18 16:19:20 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-d3770084-766c-40ad-90d4-e8319ab3d1c8 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Converting VIF {"id": "395afd81-e898-47ee-a928-eaab584d5b4e", "address": "fa:16:3e:fa:1c:ad", "network": {"id": "0fafa4bf-5d8d-43f9-8cc3-ae6e7c58812c", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-891115046-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.120", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8edf93a24e754e1ea58c0a7fd4f553dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap395afd81-e8", "ovs_interfaceid": "395afd81-e898-47ee-a928-eaab584d5b4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:19:20 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-d3770084-766c-40ad-90d4-e8319ab3d1c8 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:fa:1c:ad,bridge_name='br-int',has_traffic_filtering=True,id=395afd81-e898-47ee-a928-eaab584d5b4e,network=Network(0fafa4bf-5d8d-43f9-8cc3-ae6e7c58812c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap395afd81-e8') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:19:20 user nova-compute[70975]: DEBUG os_vif [None req-d3770084-766c-40ad-90d4-e8319ab3d1c8 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:fa:1c:ad,bridge_name='br-int',has_traffic_filtering=True,id=395afd81-e898-47ee-a928-eaab584d5b4e,network=Network(0fafa4bf-5d8d-43f9-8cc3-ae6e7c58812c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap395afd81-e8') {{(pid=70975) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 18 16:19:20 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:20 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap395afd81-e8, bridge=br-int, if_exists=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:19:20 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:20 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:19:20 user nova-compute[70975]: INFO os_vif [None req-d3770084-766c-40ad-90d4-e8319ab3d1c8 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:fa:1c:ad,bridge_name='br-int',has_traffic_filtering=True,id=395afd81-e898-47ee-a928-eaab584d5b4e,network=Network(0fafa4bf-5d8d-43f9-8cc3-ae6e7c58812c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap395afd81-e8') Apr 18 16:19:20 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-d3770084-766c-40ad-90d4-e8319ab3d1c8 
tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Deleting instance files /opt/stack/data/nova/instances/6c592508-0444-4b42-a0b5-e3d8bd97f5ba_del Apr 18 16:19:20 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-d3770084-766c-40ad-90d4-e8319ab3d1c8 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Deletion of /opt/stack/data/nova/instances/6c592508-0444-4b42-a0b5-e3d8bd97f5ba_del complete Apr 18 16:19:20 user nova-compute[70975]: INFO nova.compute.manager [None req-d3770084-766c-40ad-90d4-e8319ab3d1c8 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Took 0.66 seconds to destroy the instance on the hypervisor. Apr 18 16:19:20 user nova-compute[70975]: DEBUG oslo.service.loopingcall [None req-d3770084-766c-40ad-90d4-e8319ab3d1c8 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=70975) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 18 16:19:20 user nova-compute[70975]: DEBUG nova.compute.manager [-] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Deallocating network for instance {{(pid=70975) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 18 16:19:20 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] deallocate_for_instance() {{(pid=70975) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 18 16:19:21 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Updating instance_info_cache with network_info: [] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:19:21 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Took 0.71 seconds to deallocate network for instance. 
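The per-instance serialization visible in the DEBUG lines above ("Acquiring lock ... by ...", "Lock ... acquired by ... :: waited", "released ... :: held") is the standard oslo.concurrency pattern. A minimal sketch of that pattern follows, outside of Nova, with an illustrative function body and the instance UUID from the log reused only as a lock name; it assumes oslo.concurrency is installed and is not the ComputeManager code itself.

from oslo_concurrency import lockutils

# Illustrative only: serialize work keyed on one instance UUID, the way
# ComputeManager wraps do_terminate_instance. The "waited"/"held" timings
# in the log are measured around the decorated call by lockutils' wrapper.
@lockutils.synchronized('6c592508-0444-4b42-a0b5-e3d8bd97f5ba')
def do_terminate_instance():
    # Critical section: only one caller holding this lock name runs at a time.
    pass

do_terminate_instance()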
Apr 18 16:19:21 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-d3770084-766c-40ad-90d4-e8319ab3d1c8 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:19:21 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-d3770084-766c-40ad-90d4-e8319ab3d1c8 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:19:21 user nova-compute[70975]: DEBUG nova.compute.manager [req-46be6c8f-ea0f-4d54-bd0c-1718f39ff622 req-2866355b-d60a-4064-8ba7-d10597d7869c service nova] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Received event network-vif-deleted-395afd81-e898-47ee-a928-eaab584d5b4e {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:19:21 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-d3770084-766c-40ad-90d4-e8319ab3d1c8 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:19:21 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-d3770084-766c-40ad-90d4-e8319ab3d1c8 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:19:21 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-d3770084-766c-40ad-90d4-e8319ab3d1c8 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.244s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:19:21 user nova-compute[70975]: INFO nova.scheduler.client.report [None req-d3770084-766c-40ad-90d4-e8319ab3d1c8 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Deleted allocations for instance 6c592508-0444-4b42-a0b5-e3d8bd97f5ba Apr 18 16:19:21 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-d3770084-766c-40ad-90d4-e8319ab3d1c8 tempest-VolumesAdminNegativeTest-2015888259 tempest-VolumesAdminNegativeTest-2015888259-project-member] Lock "6c592508-0444-4b42-a0b5-e3d8bd97f5ba" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.799s {{(pid=70975) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:19:22 user nova-compute[70975]: DEBUG nova.compute.manager [req-4697d80f-660e-488b-a1f1-902c582d6dcd req-98edf043-4549-4432-b27a-b9d5186c6278 service nova] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Received event network-vif-plugged-395afd81-e898-47ee-a928-eaab584d5b4e {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:19:22 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-4697d80f-660e-488b-a1f1-902c582d6dcd req-98edf043-4549-4432-b27a-b9d5186c6278 service nova] Acquiring lock "6c592508-0444-4b42-a0b5-e3d8bd97f5ba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:19:22 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-4697d80f-660e-488b-a1f1-902c582d6dcd req-98edf043-4549-4432-b27a-b9d5186c6278 service nova] Lock "6c592508-0444-4b42-a0b5-e3d8bd97f5ba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:19:22 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-4697d80f-660e-488b-a1f1-902c582d6dcd req-98edf043-4549-4432-b27a-b9d5186c6278 service nova] Lock "6c592508-0444-4b42-a0b5-e3d8bd97f5ba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:19:22 user nova-compute[70975]: DEBUG nova.compute.manager [req-4697d80f-660e-488b-a1f1-902c582d6dcd req-98edf043-4549-4432-b27a-b9d5186c6278 service nova] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] No waiting events found dispatching network-vif-plugged-395afd81-e898-47ee-a928-eaab584d5b4e {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:19:22 user nova-compute[70975]: WARNING nova.compute.manager [req-4697d80f-660e-488b-a1f1-902c582d6dcd req-98edf043-4549-4432-b27a-b9d5186c6278 service nova] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Received unexpected event network-vif-plugged-395afd81-e898-47ee-a928-eaab584d5b4e for instance with vm_state deleted and task_state None. 
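The inventory payload logged above for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 translates into schedulable capacity as (total - reserved) * allocation_ratio per resource class. A small worked example, recomputing those figures from the logged data (illustrative script, not Nova or Placement code):

# Recompute effective capacity from the inventory dict logged above.
inventory = {
    'VCPU': {'total': 12, 'reserved': 0, 'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 16023, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB': {'total': 40, 'reserved': 0, 'allocation_ratio': 1.0},
}
for rc, inv in inventory.items():
    capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
    print(f"{rc}: {capacity:g} schedulable units")
# VCPU: 48, MEMORY_MB: 15511, DISK_GB: 40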
Apr 18 16:19:25 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:30 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:19:35 user nova-compute[70975]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:19:35 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] VM Stopped (Lifecycle Event) Apr 18 16:19:35 user nova-compute[70975]: DEBUG nova.compute.manager [None req-3107181e-cc60-4731-a009-196c13223856 None None] [instance: 6c592508-0444-4b42-a0b5-e3d8bd97f5ba] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:19:35 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:19:35 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:35 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=70975) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 18 16:19:35 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=70975) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 18 16:19:35 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70975) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 18 16:19:35 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:38 user nova-compute[70975]: DEBUG nova.compute.manager [req-854e85b8-b1a6-4ce2-8e52-d6deb67ac43f req-b918a2dd-dba8-4061-bf0e-9dfb99697e29 service nova] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Received event network-changed-4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:19:38 user nova-compute[70975]: DEBUG nova.compute.manager [req-854e85b8-b1a6-4ce2-8e52-d6deb67ac43f req-b918a2dd-dba8-4061-bf0e-9dfb99697e29 service nova] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Refreshing instance network info cache due to event network-changed-4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010. 
{{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 18 16:19:38 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-854e85b8-b1a6-4ce2-8e52-d6deb67ac43f req-b918a2dd-dba8-4061-bf0e-9dfb99697e29 service nova] Acquiring lock "refresh_cache-b71bd3c1-da58-4cb0-abc3-650e11b9d4ce" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:19:38 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-854e85b8-b1a6-4ce2-8e52-d6deb67ac43f req-b918a2dd-dba8-4061-bf0e-9dfb99697e29 service nova] Acquired lock "refresh_cache-b71bd3c1-da58-4cb0-abc3-650e11b9d4ce" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:19:38 user nova-compute[70975]: DEBUG nova.network.neutron [req-854e85b8-b1a6-4ce2-8e52-d6deb67ac43f req-b918a2dd-dba8-4061-bf0e-9dfb99697e29 service nova] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Refreshing network info cache for port 4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010 {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 18 16:19:39 user nova-compute[70975]: DEBUG nova.network.neutron [req-854e85b8-b1a6-4ce2-8e52-d6deb67ac43f req-b918a2dd-dba8-4061-bf0e-9dfb99697e29 service nova] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Updated VIF entry in instance network info cache for port 4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010. {{(pid=70975) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 18 16:19:39 user nova-compute[70975]: DEBUG nova.network.neutron [req-854e85b8-b1a6-4ce2-8e52-d6deb67ac43f req-b918a2dd-dba8-4061-bf0e-9dfb99697e29 service nova] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Updating instance_info_cache with network_info: [{"id": "4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010", "address": "fa:16:3e:a3:29:06", "network": {"id": "7f49a051-667b-4e91-80de-f4bbf2d6f09e", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-316224389-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d82a93c1cb9b4a4da7114874ddf0aa27", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4aa3a6dd-3c", "ovs_interfaceid": "4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:19:39 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-854e85b8-b1a6-4ce2-8e52-d6deb67ac43f req-b918a2dd-dba8-4061-bf0e-9dfb99697e29 service nova] Releasing lock "refresh_cache-b71bd3c1-da58-4cb0-abc3-650e11b9d4ce" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:19:40 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-4e34231a-e1b7-4e65-bda5-8cab2c332fe0 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Acquiring lock 
"b71bd3c1-da58-4cb0-abc3-650e11b9d4ce" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:19:40 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-4e34231a-e1b7-4e65-bda5-8cab2c332fe0 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Lock "b71bd3c1-da58-4cb0-abc3-650e11b9d4ce" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:19:40 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-4e34231a-e1b7-4e65-bda5-8cab2c332fe0 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Acquiring lock "b71bd3c1-da58-4cb0-abc3-650e11b9d4ce-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:19:40 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-4e34231a-e1b7-4e65-bda5-8cab2c332fe0 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Lock "b71bd3c1-da58-4cb0-abc3-650e11b9d4ce-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:19:40 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-4e34231a-e1b7-4e65-bda5-8cab2c332fe0 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Lock "b71bd3c1-da58-4cb0-abc3-650e11b9d4ce-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:19:40 user nova-compute[70975]: INFO nova.compute.manager [None req-4e34231a-e1b7-4e65-bda5-8cab2c332fe0 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Terminating instance Apr 18 16:19:40 user nova-compute[70975]: DEBUG nova.compute.manager [None req-4e34231a-e1b7-4e65-bda5-8cab2c332fe0 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Start destroying the instance on the hypervisor. 
{{(pid=70975) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 18 16:19:40 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Acquiring lock "f5496c5f-292e-4912-991b-f834009e51a1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:19:40 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Lock "f5496c5f-292e-4912-991b-f834009e51a1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:19:40 user nova-compute[70975]: DEBUG nova.compute.manager [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Starting instance... {{(pid=70975) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 18 16:19:40 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:40 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:40 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:40 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:19:40 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:19:40 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=70975) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 18 16:19:40 user nova-compute[70975]: INFO nova.compute.claims [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Claim successful on node user Apr 18 16:19:40 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:40 user nova-compute[70975]: DEBUG nova.compute.manager [req-67772885-bc9e-4502-82f9-5a774b0e490a req-408fe818-8661-4b6c-aae6-83f1485329e9 service nova] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Received event network-vif-unplugged-4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:19:40 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-67772885-bc9e-4502-82f9-5a774b0e490a req-408fe818-8661-4b6c-aae6-83f1485329e9 service nova] Acquiring lock "b71bd3c1-da58-4cb0-abc3-650e11b9d4ce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:19:40 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-67772885-bc9e-4502-82f9-5a774b0e490a req-408fe818-8661-4b6c-aae6-83f1485329e9 service nova] Lock "b71bd3c1-da58-4cb0-abc3-650e11b9d4ce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:19:40 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-67772885-bc9e-4502-82f9-5a774b0e490a req-408fe818-8661-4b6c-aae6-83f1485329e9 service nova] Lock "b71bd3c1-da58-4cb0-abc3-650e11b9d4ce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:19:40 user nova-compute[70975]: DEBUG nova.compute.manager [req-67772885-bc9e-4502-82f9-5a774b0e490a req-408fe818-8661-4b6c-aae6-83f1485329e9 service nova] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] No waiting events found dispatching network-vif-unplugged-4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:19:40 user nova-compute[70975]: DEBUG nova.compute.manager [req-67772885-bc9e-4502-82f9-5a774b0e490a req-408fe818-8661-4b6c-aae6-83f1485329e9 service nova] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Received event network-vif-unplugged-4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010 for instance with task_state deleting. 
{{(pid=70975) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 18 16:19:40 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:19:40 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:19:40 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.303s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:19:40 user nova-compute[70975]: DEBUG nova.compute.manager [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Start building networks asynchronously for instance. {{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 18 16:19:40 user nova-compute[70975]: DEBUG nova.compute.manager [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Allocating IP information in the background. {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 18 16:19:40 user nova-compute[70975]: DEBUG nova.network.neutron [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: f5496c5f-292e-4912-991b-f834009e51a1] allocate_for_instance() {{(pid=70975) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 18 16:19:40 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 18 16:19:40 user nova-compute[70975]: DEBUG nova.compute.manager [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Start building block device mappings for instance. {{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 18 16:19:40 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:40 user nova-compute[70975]: DEBUG nova.policy [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '73a99bbf510f4f67bb7a35901ba3edc5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f9987eeaa6b24ca48e80e8d5318f02ac', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70975) authorize /opt/stack/nova/nova/policy.py:203}} Apr 18 16:19:40 user nova-compute[70975]: DEBUG nova.compute.manager [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Start spawning the instance on the hypervisor. {{(pid=70975) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 18 16:19:40 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Creating instance directory {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 18 16:19:40 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Creating image(s) Apr 18 16:19:40 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Acquiring lock "/opt/stack/data/nova/instances/f5496c5f-292e-4912-991b-f834009e51a1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:19:40 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Lock "/opt/stack/data/nova/instances/f5496c5f-292e-4912-991b-f834009e51a1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:19:40 user 
nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Lock "/opt/stack/data/nova/instances/f5496c5f-292e-4912-991b-f834009e51a1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:19:40 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:19:40 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:40 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Instance destroyed successfully. Apr 18 16:19:40 user nova-compute[70975]: DEBUG nova.objects.instance [None req-4e34231a-e1b7-4e65-bda5-8cab2c332fe0 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Lazy-loading 'resources' on Instance uuid b71bd3c1-da58-4cb0-abc3-650e11b9d4ce {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:19:40 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.138s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:19:40 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Acquiring lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:19:40 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:19:40 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Running cmd 
(subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:19:41 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-4e34231a-e1b7-4e65-bda5-8cab2c332fe0 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:17:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-2075963637',display_name='tempest-AttachVolumeTestJSON-server-2075963637',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-2075963637',id=13,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAkq9Vg7VDQCpKpiGFoZfkEz1qZcQquI3n1H/unrAhcJuN8Zdg6SoPHia4dOkiKjV573Nr9cV3ZtHK+a5VfiLfEY5Cki6rbV4aTWzAjQWI/N4FbFpvBWX1A+Usn/9nq2QA==',key_name='tempest-keypair-1743850703',keypairs=,launch_index=0,launched_at=2023-04-18T16:17:54Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='d82a93c1cb9b4a4da7114874ddf0aa27',ramdisk_id='',reservation_id='r-00coyh3s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeTestJSON-313351389',owner_user_name='tempest-AttachVolumeTestJSON-313351389-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-18T16:17:54Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='fd46686fd5b845cca0f3d9452a86f4ca',uuid=b71bd3c1-da58-4cb0-abc3-650e11b9d4ce,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010", "address": "fa:16:3e:a3:29:06", "network": {"id": "7f49a051-667b-4e91-80de-f4bbf2d6f09e", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-316224389-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": 
"172.24.4.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d82a93c1cb9b4a4da7114874ddf0aa27", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4aa3a6dd-3c", "ovs_interfaceid": "4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 18 16:19:41 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-4e34231a-e1b7-4e65-bda5-8cab2c332fe0 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Converting VIF {"id": "4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010", "address": "fa:16:3e:a3:29:06", "network": {"id": "7f49a051-667b-4e91-80de-f4bbf2d6f09e", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-316224389-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d82a93c1cb9b4a4da7114874ddf0aa27", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4aa3a6dd-3c", "ovs_interfaceid": "4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:19:41 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-4e34231a-e1b7-4e65-bda5-8cab2c332fe0 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a3:29:06,bridge_name='br-int',has_traffic_filtering=True,id=4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010,network=Network(7f49a051-667b-4e91-80de-f4bbf2d6f09e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4aa3a6dd-3c') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:19:41 user nova-compute[70975]: DEBUG os_vif [None req-4e34231a-e1b7-4e65-bda5-8cab2c332fe0 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a3:29:06,bridge_name='br-int',has_traffic_filtering=True,id=4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010,network=Network(7f49a051-667b-4e91-80de-f4bbf2d6f09e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4aa3a6dd-3c') {{(pid=70975) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 18 16:19:41 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:41 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] 
Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4aa3a6dd-3c, bridge=br-int, if_exists=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:19:41 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:41 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:19:41 user nova-compute[70975]: INFO os_vif [None req-4e34231a-e1b7-4e65-bda5-8cab2c332fe0 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a3:29:06,bridge_name='br-int',has_traffic_filtering=True,id=4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010,network=Network(7f49a051-667b-4e91-80de-f4bbf2d6f09e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4aa3a6dd-3c') Apr 18 16:19:41 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-4e34231a-e1b7-4e65-bda5-8cab2c332fe0 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Deleting instance files /opt/stack/data/nova/instances/b71bd3c1-da58-4cb0-abc3-650e11b9d4ce_del Apr 18 16:19:41 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-4e34231a-e1b7-4e65-bda5-8cab2c332fe0 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Deletion of /opt/stack/data/nova/instances/b71bd3c1-da58-4cb0-abc3-650e11b9d4ce_del complete Apr 18 16:19:41 user nova-compute[70975]: INFO nova.compute.manager [None req-4e34231a-e1b7-4e65-bda5-8cab2c332fe0 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Took 0.91 seconds to destroy the instance on the hypervisor. Apr 18 16:19:41 user nova-compute[70975]: DEBUG oslo.service.loopingcall [None req-4e34231a-e1b7-4e65-bda5-8cab2c332fe0 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=70975) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 18 16:19:41 user nova-compute[70975]: DEBUG nova.compute.manager [-] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Deallocating network for instance {{(pid=70975) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 18 16:19:41 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] deallocate_for_instance() {{(pid=70975) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 18 16:19:41 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.136s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:19:41 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/f5496c5f-292e-4912-991b-f834009e51a1/disk 1073741824 {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:19:41 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/f5496c5f-292e-4912-991b-f834009e51a1/disk 1073741824" returned: 0 in 0.049s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:19:41 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.189s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:19:41 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:19:41 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-8bda2dd2-aa55-4d95-a808-696766ac749c 
tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.136s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:19:41 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Checking if we can resize image /opt/stack/data/nova/instances/f5496c5f-292e-4912-991b-f834009e51a1/disk. size=1073741824 {{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 18 16:19:41 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f5496c5f-292e-4912-991b-f834009e51a1/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:19:41 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f5496c5f-292e-4912-991b-f834009e51a1/disk --force-share --output=json" returned: 0 in 0.152s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:19:41 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Cannot resize image /opt/stack/data/nova/instances/f5496c5f-292e-4912-991b-f834009e51a1/disk to a smaller size. 
{{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 18 16:19:41 user nova-compute[70975]: DEBUG nova.objects.instance [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Lazy-loading 'migration_context' on Instance uuid f5496c5f-292e-4912-991b-f834009e51a1 {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:19:41 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Created local disks {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 18 16:19:41 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Ensure instance console log exists: /opt/stack/data/nova/instances/f5496c5f-292e-4912-991b-f834009e51a1/console.log {{(pid=70975) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 18 16:19:41 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:19:41 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:19:41 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:19:41 user nova-compute[70975]: DEBUG nova.network.neutron [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Successfully created port: 4d7beeed-1a0b-490b-a788-3b8442f86758 {{(pid=70975) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 18 16:19:41 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Updating instance_info_cache with network_info: [] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:19:41 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Took 0.78 seconds to deallocate network for instance. 
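The prlimit-wrapped qemu-img calls and the qcow2 overlay creation logged above can be reproduced with oslo.concurrency roughly as follows; this is a minimal sketch rather than Nova's actual imagebackend code, with the paths copied from the log and the limits assumed to match the --as/--cpu values shown:

    # Sketch only: resource-capped qemu-img calls similar to the CMD lines above.
    from oslo_concurrency import processutils

    BASE = '/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053'
    DISK = '/opt/stack/data/nova/instances/f5496c5f-292e-4912-991b-f834009e51a1/disk'

    # prlimit= makes oslo wrap the command in "python -m oslo_concurrency.prlimit
    # --as=... --cpu=...", which is the wrapper visible in the logged CMD lines.
    QEMU_IMG_LIMITS = processutils.ProcessLimits(cpu_time=30,
                                                 address_space=1024 ** 3)

    # Inspect the cached base image without taking a write lock on it.
    out, _err = processutils.execute(
        'env', 'LC_ALL=C', 'LANG=C',
        'qemu-img', 'info', BASE, '--force-share', '--output=json',
        prlimit=QEMU_IMG_LIMITS)

    # Create the per-instance qcow2 overlay backed by the cached raw base image.
    processutils.execute(
        'env', 'LC_ALL=C', 'LANG=C',
        'qemu-img', 'create', '-f', 'qcow2',
        '-o', 'backing_file=%s,backing_fmt=raw' % BASE,
        DISK, '1073741824')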
Apr 18 16:19:41 user nova-compute[70975]: DEBUG nova.compute.manager [req-62f72798-e66d-4c23-9200-033f7ff77c52 req-f358ca67-3140-41ca-9bd6-08f0f27faa16 service nova] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Received event network-vif-deleted-4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:19:41 user nova-compute[70975]: INFO nova.compute.manager [req-62f72798-e66d-4c23-9200-033f7ff77c52 req-f358ca67-3140-41ca-9bd6-08f0f27faa16 service nova] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Neutron deleted interface 4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010; detaching it from the instance and deleting it from the info cache Apr 18 16:19:41 user nova-compute[70975]: DEBUG nova.network.neutron [req-62f72798-e66d-4c23-9200-033f7ff77c52 req-f358ca67-3140-41ca-9bd6-08f0f27faa16 service nova] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Updating instance_info_cache with network_info: [] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:19:41 user nova-compute[70975]: DEBUG nova.compute.manager [req-62f72798-e66d-4c23-9200-033f7ff77c52 req-f358ca67-3140-41ca-9bd6-08f0f27faa16 service nova] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Detach interface failed, port_id=4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010, reason: Instance b71bd3c1-da58-4cb0-abc3-650e11b9d4ce could not be found. {{(pid=70975) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 18 16:19:41 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-4e34231a-e1b7-4e65-bda5-8cab2c332fe0 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:19:41 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-4e34231a-e1b7-4e65-bda5-8cab2c332fe0 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:19:42 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-4e34231a-e1b7-4e65-bda5-8cab2c332fe0 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:19:42 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-4e34231a-e1b7-4e65-bda5-8cab2c332fe0 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:19:42 user 
nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-4e34231a-e1b7-4e65-bda5-8cab2c332fe0 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.222s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:19:42 user nova-compute[70975]: INFO nova.scheduler.client.report [None req-4e34231a-e1b7-4e65-bda5-8cab2c332fe0 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Deleted allocations for instance b71bd3c1-da58-4cb0-abc3-650e11b9d4ce Apr 18 16:19:42 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-4e34231a-e1b7-4e65-bda5-8cab2c332fe0 tempest-AttachVolumeTestJSON-313351389 tempest-AttachVolumeTestJSON-313351389-project-member] Lock "b71bd3c1-da58-4cb0-abc3-650e11b9d4ce" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.120s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:19:42 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:42 user nova-compute[70975]: DEBUG nova.compute.manager [req-f764e83b-4914-478f-a41c-93d8513241e5 req-3fae9166-10bf-4867-93a1-6084edce1ac0 service nova] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Received event network-vif-plugged-4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:19:42 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-f764e83b-4914-478f-a41c-93d8513241e5 req-3fae9166-10bf-4867-93a1-6084edce1ac0 service nova] Acquiring lock "b71bd3c1-da58-4cb0-abc3-650e11b9d4ce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:19:42 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-f764e83b-4914-478f-a41c-93d8513241e5 req-3fae9166-10bf-4867-93a1-6084edce1ac0 service nova] Lock "b71bd3c1-da58-4cb0-abc3-650e11b9d4ce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:19:42 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-f764e83b-4914-478f-a41c-93d8513241e5 req-3fae9166-10bf-4867-93a1-6084edce1ac0 service nova] Lock "b71bd3c1-da58-4cb0-abc3-650e11b9d4ce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:19:42 user nova-compute[70975]: DEBUG nova.compute.manager [req-f764e83b-4914-478f-a41c-93d8513241e5 req-3fae9166-10bf-4867-93a1-6084edce1ac0 service nova] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] No waiting events found dispatching network-vif-plugged-4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:19:42 user nova-compute[70975]: WARNING nova.compute.manager [req-f764e83b-4914-478f-a41c-93d8513241e5 req-3fae9166-10bf-4867-93a1-6084edce1ac0 service nova] [instance: 
b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Received unexpected event network-vif-plugged-4aa3a6dd-3c8a-4cb6-8fca-beb4d7988010 for instance with vm_state deleted and task_state None. Apr 18 16:19:42 user nova-compute[70975]: DEBUG nova.network.neutron [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Successfully updated port: 4d7beeed-1a0b-490b-a788-3b8442f86758 {{(pid=70975) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 18 16:19:42 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Acquiring lock "refresh_cache-f5496c5f-292e-4912-991b-f834009e51a1" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:19:42 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Acquired lock "refresh_cache-f5496c5f-292e-4912-991b-f834009e51a1" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:19:42 user nova-compute[70975]: DEBUG nova.network.neutron [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Building network info cache for instance {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 18 16:19:43 user nova-compute[70975]: DEBUG nova.network.neutron [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Instance cache missing network info. 
{{(pid=70975) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 18 16:19:43 user nova-compute[70975]: DEBUG nova.network.neutron [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Updating instance_info_cache with network_info: [{"id": "4d7beeed-1a0b-490b-a788-3b8442f86758", "address": "fa:16:3e:fa:7c:e4", "network": {"id": "51cddd0f-0e4b-4d37-be40-ce5592263bc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1803491920-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f9987eeaa6b24ca48e80e8d5318f02ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d7beeed-1a", "ovs_interfaceid": "4d7beeed-1a0b-490b-a788-3b8442f86758", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:19:43 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Releasing lock "refresh_cache-f5496c5f-292e-4912-991b-f834009e51a1" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:19:43 user nova-compute[70975]: DEBUG nova.compute.manager [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Instance network_info: |[{"id": "4d7beeed-1a0b-490b-a788-3b8442f86758", "address": "fa:16:3e:fa:7c:e4", "network": {"id": "51cddd0f-0e4b-4d37-be40-ce5592263bc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1803491920-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f9987eeaa6b24ca48e80e8d5318f02ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d7beeed-1a", "ovs_interfaceid": "4d7beeed-1a0b-490b-a788-3b8442f86758", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 18 16:19:43 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: 
f5496c5f-292e-4912-991b-f834009e51a1] Start _get_guest_xml network_info=[{"id": "4d7beeed-1a0b-490b-a788-3b8442f86758", "address": "fa:16:3e:fa:7c:e4", "network": {"id": "51cddd0f-0e4b-4d37-be40-ce5592263bc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1803491920-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f9987eeaa6b24ca48e80e8d5318f02ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d7beeed-1a", "ovs_interfaceid": "4d7beeed-1a0b-490b-a788-3b8442f86758", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encrypted': False, 'device_type': 'disk', 'encryption_secret_uuid': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'b11a20de-f82a-4158-b53e-0a0c7a1552cb'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 18 16:19:43 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:19:43 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
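_get_guest_xml renders the libvirt domain definition that the driver subsequently hands to libvirt. Stripped of Nova's config objects, the equivalent final step with the libvirt Python bindings looks roughly like the sketch below; the connection URI and the file holding the rendered XML are assumptions, not values taken from this log:

    # Sketch only: define and boot a guest from already-rendered domain XML.
    import libvirt

    conn = libvirt.open('qemu:///system')           # URI assumed, not from this log

    with open('/tmp/instance-0000000f.xml') as f:    # hypothetical file with the XML
        domain_xml = f.read()

    dom = conn.defineXML(domain_xml)                 # persist the domain definition
    dom.create()                                     # boot the guest
    conn.close()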
Apr 18 16:19:43 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70975) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 18 16:19:43 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-18T16:11:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=), allow threads: True {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 18 16:19:43 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Flavor limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 18 16:19:43 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Image limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 18 16:19:43 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Flavor pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 18 16:19:43 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Image pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 18 16:19:43 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 18 16:19:43 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) 
{{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 18 16:19:43 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 18 16:19:43 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Got 1 possible topologies {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 18 16:19:43 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 18 16:19:43 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 18 16:19:43 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:19:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1134586344',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-1134586344',id=15,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ/t2zRXHckg0I8s8IBMQHTN6DHAOZW32I+dgNI9pg+HkbIaOsxkar0QwwPFIjcioaOE616z5xMRZ4Ihxh2dkemRrU9uEbk/jjZSUa1gm6kQJDS4/DyUt2ZBHtNG3kEG3g==',key_name='tempest-keypair-1613556039',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f9987eeaa6b24ca48e80e8d5318f02ac',ramdisk_id='',reservation_id='r-7mk95l5u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-1663710151',owner_user_name='tempest-AttachVolumeShelveTestJSON-1663710151-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:19:41Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='73a99bbf510f4f67bb7a35901ba3edc5',uuid=f5496c5f-292e-4912-991b-f834009e51a1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4d7beeed-1a0b-490b-a788-3b8442f86758", "address": "fa:16:3e:fa:7c:e4", "network": {"id": "51cddd0f-0e4b-4d37-be40-ce5592263bc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1803491920-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f9987eeaa6b24ca48e80e8d5318f02ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d7beeed-1a", "ovs_interfaceid": "4d7beeed-1a0b-490b-a788-3b8442f86758", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70975) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 18 16:19:43 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Converting VIF {"id": "4d7beeed-1a0b-490b-a788-3b8442f86758", "address": "fa:16:3e:fa:7c:e4", "network": {"id": "51cddd0f-0e4b-4d37-be40-ce5592263bc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1803491920-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], 
"meta": {"injected": false, "tenant_id": "f9987eeaa6b24ca48e80e8d5318f02ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d7beeed-1a", "ovs_interfaceid": "4d7beeed-1a0b-490b-a788-3b8442f86758", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:19:43 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fa:7c:e4,bridge_name='br-int',has_traffic_filtering=True,id=4d7beeed-1a0b-490b-a788-3b8442f86758,network=Network(51cddd0f-0e4b-4d37-be40-ce5592263bc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d7beeed-1a') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:19:43 user nova-compute[70975]: DEBUG nova.objects.instance [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Lazy-loading 'pci_devices' on Instance uuid f5496c5f-292e-4912-991b-f834009e51a1 {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:19:43 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: f5496c5f-292e-4912-991b-f834009e51a1] End _get_guest_xml xml= Apr 18 16:19:43 user nova-compute[70975]: f5496c5f-292e-4912-991b-f834009e51a1 Apr 18 16:19:43 user nova-compute[70975]: instance-0000000f Apr 18 16:19:43 user nova-compute[70975]: 131072 Apr 18 16:19:43 user nova-compute[70975]: 1 Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: tempest-AttachVolumeShelveTestJSON-server-1134586344 Apr 18 16:19:43 user nova-compute[70975]: 2023-04-18 16:19:43 Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: 128 Apr 18 16:19:43 user nova-compute[70975]: 1 Apr 18 16:19:43 user nova-compute[70975]: 0 Apr 18 16:19:43 user nova-compute[70975]: 0 Apr 18 16:19:43 user nova-compute[70975]: 1 Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: tempest-AttachVolumeShelveTestJSON-1663710151-project-member Apr 18 16:19:43 user nova-compute[70975]: tempest-AttachVolumeShelveTestJSON-1663710151 Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: OpenStack Foundation Apr 18 16:19:43 user nova-compute[70975]: OpenStack Nova Apr 18 16:19:43 user 
nova-compute[70975]: 0.0.0 Apr 18 16:19:43 user nova-compute[70975]: f5496c5f-292e-4912-991b-f834009e51a1 Apr 18 16:19:43 user nova-compute[70975]: f5496c5f-292e-4912-991b-f834009e51a1 Apr 18 16:19:43 user nova-compute[70975]: Virtual Machine Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: hvm Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Nehalem Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: /dev/urandom Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: Apr 18 16:19:43 user nova-compute[70975]: {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 18 16:19:43 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:19:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1134586344',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-1134586344',id=15,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ/t2zRXHckg0I8s8IBMQHTN6DHAOZW32I+dgNI9pg+HkbIaOsxkar0QwwPFIjcioaOE616z5xMRZ4Ihxh2dkemRrU9uEbk/jjZSUa1gm6kQJDS4/DyUt2ZBHtNG3kEG3g==',key_name='tempest-keypair-1613556039',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f9987eeaa6b24ca48e80e8d5318f02ac',ramdisk_id='',reservation_id='r-7mk95l5u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-1663710151',owner_user_name='tempest-AttachVolumeShelveTestJSON-1663710151-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:19:41Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='73a99bbf510f4f67bb7a35901ba3edc5',uuid=f5496c5f-292e-4912-991b-f834009e51a1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4d7beeed-1a0b-490b-a788-3b8442f86758", "address": "fa:16:3e:fa:7c:e4", "network": {"id": "51cddd0f-0e4b-4d37-be40-ce5592263bc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1803491920-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f9987eeaa6b24ca48e80e8d5318f02ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d7beeed-1a", "ovs_interfaceid": "4d7beeed-1a0b-490b-a788-3b8442f86758", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 18 16:19:43 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Converting VIF {"id": "4d7beeed-1a0b-490b-a788-3b8442f86758", "address": "fa:16:3e:fa:7c:e4", "network": {"id": "51cddd0f-0e4b-4d37-be40-ce5592263bc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1803491920-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], 
"meta": {"injected": false, "tenant_id": "f9987eeaa6b24ca48e80e8d5318f02ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d7beeed-1a", "ovs_interfaceid": "4d7beeed-1a0b-490b-a788-3b8442f86758", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:19:43 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fa:7c:e4,bridge_name='br-int',has_traffic_filtering=True,id=4d7beeed-1a0b-490b-a788-3b8442f86758,network=Network(51cddd0f-0e4b-4d37-be40-ce5592263bc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d7beeed-1a') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:19:43 user nova-compute[70975]: DEBUG os_vif [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:7c:e4,bridge_name='br-int',has_traffic_filtering=True,id=4d7beeed-1a0b-490b-a788-3b8442f86758,network=Network(51cddd0f-0e4b-4d37-be40-ce5592263bc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d7beeed-1a') {{(pid=70975) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 18 16:19:43 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:43 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:19:43 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 18 16:19:43 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:43 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4d7beeed-1a, may_exist=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:19:43 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4d7beeed-1a, col_values=(('external_ids', {'iface-id': '4d7beeed-1a0b-490b-a788-3b8442f86758', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fa:7c:e4', 'vm-uuid': 'f5496c5f-292e-4912-991b-f834009e51a1'}),)) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:19:43 user 
nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:43 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:19:43 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:43 user nova-compute[70975]: INFO os_vif [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fa:7c:e4,bridge_name='br-int',has_traffic_filtering=True,id=4d7beeed-1a0b-490b-a788-3b8442f86758,network=Network(51cddd0f-0e4b-4d37-be40-ce5592263bc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d7beeed-1a') Apr 18 16:19:43 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] No BDM found with device name vda, not building metadata. {{(pid=70975) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 18 16:19:43 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] No VIF found with MAC fa:16:3e:fa:7c:e4, not building metadata {{(pid=70975) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 18 16:19:44 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:44 user nova-compute[70975]: DEBUG nova.compute.manager [req-1119b499-48e6-4758-96ac-b5930046e93e req-4d6e6ad4-ce35-4b61-afbd-53248a8ce79d service nova] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Received event network-changed-4d7beeed-1a0b-490b-a788-3b8442f86758 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:19:44 user nova-compute[70975]: DEBUG nova.compute.manager [req-1119b499-48e6-4758-96ac-b5930046e93e req-4d6e6ad4-ce35-4b61-afbd-53248a8ce79d service nova] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Refreshing instance network info cache due to event network-changed-4d7beeed-1a0b-490b-a788-3b8442f86758. 
{{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 18 16:19:44 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-1119b499-48e6-4758-96ac-b5930046e93e req-4d6e6ad4-ce35-4b61-afbd-53248a8ce79d service nova] Acquiring lock "refresh_cache-f5496c5f-292e-4912-991b-f834009e51a1" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:19:44 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-1119b499-48e6-4758-96ac-b5930046e93e req-4d6e6ad4-ce35-4b61-afbd-53248a8ce79d service nova] Acquired lock "refresh_cache-f5496c5f-292e-4912-991b-f834009e51a1" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:19:44 user nova-compute[70975]: DEBUG nova.network.neutron [req-1119b499-48e6-4758-96ac-b5930046e93e req-4d6e6ad4-ce35-4b61-afbd-53248a8ce79d service nova] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Refreshing network info cache for port 4d7beeed-1a0b-490b-a788-3b8442f86758 {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 18 16:19:44 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:44 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:44 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:44 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:45 user nova-compute[70975]: DEBUG nova.compute.manager [req-47e5a769-b394-4163-91c8-837b81755ed7 req-c2c094fb-d4dd-40c6-8c72-90e31bd78a05 service nova] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Received event network-vif-plugged-4d7beeed-1a0b-490b-a788-3b8442f86758 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:19:45 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-47e5a769-b394-4163-91c8-837b81755ed7 req-c2c094fb-d4dd-40c6-8c72-90e31bd78a05 service nova] Acquiring lock "f5496c5f-292e-4912-991b-f834009e51a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:19:45 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-47e5a769-b394-4163-91c8-837b81755ed7 req-c2c094fb-d4dd-40c6-8c72-90e31bd78a05 service nova] Lock "f5496c5f-292e-4912-991b-f834009e51a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:19:45 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-47e5a769-b394-4163-91c8-837b81755ed7 req-c2c094fb-d4dd-40c6-8c72-90e31bd78a05 service nova] Lock "f5496c5f-292e-4912-991b-f834009e51a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:19:45 user nova-compute[70975]: DEBUG nova.compute.manager 
[req-47e5a769-b394-4163-91c8-837b81755ed7 req-c2c094fb-d4dd-40c6-8c72-90e31bd78a05 service nova] [instance: f5496c5f-292e-4912-991b-f834009e51a1] No waiting events found dispatching network-vif-plugged-4d7beeed-1a0b-490b-a788-3b8442f86758 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:19:45 user nova-compute[70975]: WARNING nova.compute.manager [req-47e5a769-b394-4163-91c8-837b81755ed7 req-c2c094fb-d4dd-40c6-8c72-90e31bd78a05 service nova] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Received unexpected event network-vif-plugged-4d7beeed-1a0b-490b-a788-3b8442f86758 for instance with vm_state building and task_state spawning. Apr 18 16:19:45 user nova-compute[70975]: DEBUG nova.network.neutron [req-1119b499-48e6-4758-96ac-b5930046e93e req-4d6e6ad4-ce35-4b61-afbd-53248a8ce79d service nova] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Updated VIF entry in instance network info cache for port 4d7beeed-1a0b-490b-a788-3b8442f86758. {{(pid=70975) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 18 16:19:45 user nova-compute[70975]: DEBUG nova.network.neutron [req-1119b499-48e6-4758-96ac-b5930046e93e req-4d6e6ad4-ce35-4b61-afbd-53248a8ce79d service nova] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Updating instance_info_cache with network_info: [{"id": "4d7beeed-1a0b-490b-a788-3b8442f86758", "address": "fa:16:3e:fa:7c:e4", "network": {"id": "51cddd0f-0e4b-4d37-be40-ce5592263bc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1803491920-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f9987eeaa6b24ca48e80e8d5318f02ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d7beeed-1a", "ovs_interfaceid": "4d7beeed-1a0b-490b-a788-3b8442f86758", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:19:45 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-1119b499-48e6-4758-96ac-b5930046e93e req-4d6e6ad4-ce35-4b61-afbd-53248a8ce79d service nova] Releasing lock "refresh_cache-f5496c5f-292e-4912-991b-f834009e51a1" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:19:45 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:45 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:45 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:45 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-9d5ca017-8acf-477c-9870-db4667cd991f tempest-ServerRescueNegativeTestJSON-1586888284 
tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Acquiring lock "da82d905-1ca1-403d-9598-7561e69b9704" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:19:45 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-9d5ca017-8acf-477c-9870-db4667cd991f tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lock "da82d905-1ca1-403d-9598-7561e69b9704" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:19:45 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-9d5ca017-8acf-477c-9870-db4667cd991f tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Acquiring lock "da82d905-1ca1-403d-9598-7561e69b9704-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:19:45 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-9d5ca017-8acf-477c-9870-db4667cd991f tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lock "da82d905-1ca1-403d-9598-7561e69b9704-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:19:45 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-9d5ca017-8acf-477c-9870-db4667cd991f tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lock "da82d905-1ca1-403d-9598-7561e69b9704-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:19:45 user nova-compute[70975]: INFO nova.compute.manager [None req-9d5ca017-8acf-477c-9870-db4667cd991f tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Terminating instance Apr 18 16:19:45 user nova-compute[70975]: DEBUG nova.compute.manager [None req-9d5ca017-8acf-477c-9870-db4667cd991f tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Start destroying the instance on the hypervisor. 
{{(pid=70975) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 18 16:19:45 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:45 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:45 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:45 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:46 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Instance destroyed successfully. Apr 18 16:19:46 user nova-compute[70975]: DEBUG nova.objects.instance [None req-9d5ca017-8acf-477c-9870-db4667cd991f tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lazy-loading 'resources' on Instance uuid da82d905-1ca1-403d-9598-7561e69b9704 {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:19:46 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-9d5ca017-8acf-477c-9870-db4667cd991f tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:14:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1478486128',display_name='tempest-ServerRescueNegativeTestJSON-server-1478486128',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-1478486128',id=2,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-18T16:14:18Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='261e8ba82d9e4203917afb0241a3b4fc',ramdisk_id='',reservation_id='r-vu07y5ik',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerRescueNegativeTestJSON-1586888284',owner_user_name='tempest-ServerRescueNeg
ativeTestJSON-1586888284-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-18T16:14:19Z,user_data=None,user_id='a8a3f45f9c6c431781fb582b8da22b0b',uuid=da82d905-1ca1-403d-9598-7561e69b9704,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "894e80db-f051-4b32-adc8-e3afa321eb34", "address": "fa:16:3e:ad:ba:71", "network": {"id": "1bff5cf0-81ae-4bf0-88ab-fd844ee78e3f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1814061150-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "261e8ba82d9e4203917afb0241a3b4fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap894e80db-f0", "ovs_interfaceid": "894e80db-f051-4b32-adc8-e3afa321eb34", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 18 16:19:46 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-9d5ca017-8acf-477c-9870-db4667cd991f tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Converting VIF {"id": "894e80db-f051-4b32-adc8-e3afa321eb34", "address": "fa:16:3e:ad:ba:71", "network": {"id": "1bff5cf0-81ae-4bf0-88ab-fd844ee78e3f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1814061150-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "261e8ba82d9e4203917afb0241a3b4fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap894e80db-f0", "ovs_interfaceid": "894e80db-f051-4b32-adc8-e3afa321eb34", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:19:46 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-9d5ca017-8acf-477c-9870-db4667cd991f tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ad:ba:71,bridge_name='br-int',has_traffic_filtering=True,id=894e80db-f051-4b32-adc8-e3afa321eb34,network=Network(1bff5cf0-81ae-4bf0-88ab-fd844ee78e3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap894e80db-f0') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:19:46 user nova-compute[70975]: DEBUG os_vif [None req-9d5ca017-8acf-477c-9870-db4667cd991f tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Unplugging 
vif VIFOpenVSwitch(active=True,address=fa:16:3e:ad:ba:71,bridge_name='br-int',has_traffic_filtering=True,id=894e80db-f051-4b32-adc8-e3afa321eb34,network=Network(1bff5cf0-81ae-4bf0-88ab-fd844ee78e3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap894e80db-f0') {{(pid=70975) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 18 16:19:46 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:46 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap894e80db-f0, bridge=br-int, if_exists=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:19:46 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:46 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:19:46 user nova-compute[70975]: INFO os_vif [None req-9d5ca017-8acf-477c-9870-db4667cd991f tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ad:ba:71,bridge_name='br-int',has_traffic_filtering=True,id=894e80db-f051-4b32-adc8-e3afa321eb34,network=Network(1bff5cf0-81ae-4bf0-88ab-fd844ee78e3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap894e80db-f0') Apr 18 16:19:46 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-9d5ca017-8acf-477c-9870-db4667cd991f tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Deleting instance files /opt/stack/data/nova/instances/da82d905-1ca1-403d-9598-7561e69b9704_del Apr 18 16:19:46 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-9d5ca017-8acf-477c-9870-db4667cd991f tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Deletion of /opt/stack/data/nova/instances/da82d905-1ca1-403d-9598-7561e69b9704_del complete Apr 18 16:19:46 user nova-compute[70975]: INFO nova.compute.manager [None req-9d5ca017-8acf-477c-9870-db4667cd991f tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Took 0.86 seconds to destroy the instance on the hypervisor. Apr 18 16:19:46 user nova-compute[70975]: DEBUG oslo.service.loopingcall [None req-9d5ca017-8acf-477c-9870-db4667cd991f tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=70975) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 18 16:19:46 user nova-compute[70975]: DEBUG nova.compute.manager [-] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Deallocating network for instance {{(pid=70975) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 18 16:19:46 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: da82d905-1ca1-403d-9598-7561e69b9704] deallocate_for_instance() {{(pid=70975) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 18 16:19:46 user nova-compute[70975]: DEBUG nova.compute.manager [req-e3589629-98e2-4d4d-bd14-e5daf32133f1 req-64ebb041-5861-4221-9bc8-6c5c0732c188 service nova] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Received event network-vif-unplugged-894e80db-f051-4b32-adc8-e3afa321eb34 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:19:46 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-e3589629-98e2-4d4d-bd14-e5daf32133f1 req-64ebb041-5861-4221-9bc8-6c5c0732c188 service nova] Acquiring lock "da82d905-1ca1-403d-9598-7561e69b9704-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:19:46 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-e3589629-98e2-4d4d-bd14-e5daf32133f1 req-64ebb041-5861-4221-9bc8-6c5c0732c188 service nova] Lock "da82d905-1ca1-403d-9598-7561e69b9704-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:19:46 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-e3589629-98e2-4d4d-bd14-e5daf32133f1 req-64ebb041-5861-4221-9bc8-6c5c0732c188 service nova] Lock "da82d905-1ca1-403d-9598-7561e69b9704-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:19:46 user nova-compute[70975]: DEBUG nova.compute.manager [req-e3589629-98e2-4d4d-bd14-e5daf32133f1 req-64ebb041-5861-4221-9bc8-6c5c0732c188 service nova] [instance: da82d905-1ca1-403d-9598-7561e69b9704] No waiting events found dispatching network-vif-unplugged-894e80db-f051-4b32-adc8-e3afa321eb34 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:19:46 user nova-compute[70975]: DEBUG nova.compute.manager [req-e3589629-98e2-4d4d-bd14-e5daf32133f1 req-64ebb041-5861-4221-9bc8-6c5c0732c188 service nova] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Received event network-vif-unplugged-894e80db-f051-4b32-adc8-e3afa321eb34 for instance with task_state deleting. 
{{(pid=70975) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 18 16:19:46 user nova-compute[70975]: DEBUG nova.compute.manager [req-e3589629-98e2-4d4d-bd14-e5daf32133f1 req-64ebb041-5861-4221-9bc8-6c5c0732c188 service nova] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Received event network-vif-plugged-894e80db-f051-4b32-adc8-e3afa321eb34 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:19:46 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-e3589629-98e2-4d4d-bd14-e5daf32133f1 req-64ebb041-5861-4221-9bc8-6c5c0732c188 service nova] Acquiring lock "da82d905-1ca1-403d-9598-7561e69b9704-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:19:46 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-e3589629-98e2-4d4d-bd14-e5daf32133f1 req-64ebb041-5861-4221-9bc8-6c5c0732c188 service nova] Lock "da82d905-1ca1-403d-9598-7561e69b9704-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:19:46 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-e3589629-98e2-4d4d-bd14-e5daf32133f1 req-64ebb041-5861-4221-9bc8-6c5c0732c188 service nova] Lock "da82d905-1ca1-403d-9598-7561e69b9704-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:19:46 user nova-compute[70975]: DEBUG nova.compute.manager [req-e3589629-98e2-4d4d-bd14-e5daf32133f1 req-64ebb041-5861-4221-9bc8-6c5c0732c188 service nova] [instance: da82d905-1ca1-403d-9598-7561e69b9704] No waiting events found dispatching network-vif-plugged-894e80db-f051-4b32-adc8-e3afa321eb34 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:19:46 user nova-compute[70975]: WARNING nova.compute.manager [req-e3589629-98e2-4d4d-bd14-e5daf32133f1 req-64ebb041-5861-4221-9bc8-6c5c0732c188 service nova] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Received unexpected event network-vif-plugged-894e80db-f051-4b32-adc8-e3afa321eb34 for instance with vm_state active and task_state deleting. 
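[Editor's illustrative sketch] The event-dispatch records above show nova-compute serializing external Neutron events behind a per-instance "<uuid>-events" lock before popping any registered waiter. A minimal, hedged sketch of that pattern using the same oslo.concurrency primitive the log names — not Nova's actual implementation; the in-memory 'waiters' registry is a hypothetical stand-in:

    # Minimal sketch of the per-instance "-events" lock pattern seen above.
    # Not Nova's code; 'waiters' is a hypothetical in-memory registry.
    from oslo_concurrency import lockutils

    waiters = {}  # {instance_uuid: {event_name: waiter}}

    def pop_instance_event(instance_uuid, event_name):
        """Pop a registered waiter under the per-instance events lock."""
        with lockutils.lock(f"{instance_uuid}-events"):
            # Returns None when, as in the log, "No waiting events found
            # dispatching ..." applies to the received event.
            return waiters.get(instance_uuid, {}).pop(event_name, None)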
Apr 18 16:19:46 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Resumed> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:19:46 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: f5496c5f-292e-4912-991b-f834009e51a1] VM Resumed (Lifecycle Event) Apr 18 16:19:46 user nova-compute[70975]: DEBUG nova.compute.manager [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Instance event wait completed in 0 seconds for {{(pid=70975) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 18 16:19:46 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Guest created on hypervisor {{(pid=70975) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 18 16:19:46 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Instance spawned successfully. Apr 18 16:19:46 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 18 16:19:46 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:19:47 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:19:47 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Found default for hw_cdrom_bus of ide {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:19:47 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Found default for hw_disk_bus of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:19:47 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None 
req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Found default for hw_input_bus of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:19:47 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Found default for hw_pointer_model of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:19:47 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Found default for hw_video_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:19:47 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Found default for hw_vif_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:19:47 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: f5496c5f-292e-4912-991b-f834009e51a1] During sync_power_state the instance has a pending task (spawning). Skip. Apr 18 16:19:47 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Started> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:19:47 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: f5496c5f-292e-4912-991b-f834009e51a1] VM Started (Lifecycle Event) Apr 18 16:19:47 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:19:47 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:19:47 user nova-compute[70975]: INFO nova.compute.manager [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Took 6.26 seconds to spawn the instance on the hypervisor. 
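[Editor's illustrative sketch] The "Checking state" / "Synchronizing instance power state" records compare the DB power_state (0, i.e. NOSTATE) with the hypervisor's view (1, i.e. RUNNING). A hedged sketch of that hypervisor-side check against the libvirt-python bindings, independent of Nova's own driver code:

    # Hedged sketch: ask libvirt whether a domain is running and map the answer
    # to the 0/1 values shown in the log (0 = NOSTATE, 1 = RUNNING).
    import libvirt

    def vm_power_state(instance_uuid: str) -> int:
        conn = libvirt.open("qemu:///system")
        try:
            dom = conn.lookupByUUIDString(instance_uuid)
            state, _reason = dom.state()
            return 1 if state == libvirt.VIR_DOMAIN_RUNNING else 0
        finally:
            conn.close()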
Apr 18 16:19:47 user nova-compute[70975]: DEBUG nova.compute.manager [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:19:47 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: f5496c5f-292e-4912-991b-f834009e51a1] During sync_power_state the instance has a pending task (spawning). Skip. Apr 18 16:19:47 user nova-compute[70975]: DEBUG nova.compute.manager [req-2188f308-e36d-4709-9fa7-6f5e31a11ceb req-e033138a-a914-496e-92ed-01349fa2d24a service nova] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Received event network-vif-plugged-4d7beeed-1a0b-490b-a788-3b8442f86758 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:19:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-2188f308-e36d-4709-9fa7-6f5e31a11ceb req-e033138a-a914-496e-92ed-01349fa2d24a service nova] Acquiring lock "f5496c5f-292e-4912-991b-f834009e51a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:19:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-2188f308-e36d-4709-9fa7-6f5e31a11ceb req-e033138a-a914-496e-92ed-01349fa2d24a service nova] Lock "f5496c5f-292e-4912-991b-f834009e51a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:19:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-2188f308-e36d-4709-9fa7-6f5e31a11ceb req-e033138a-a914-496e-92ed-01349fa2d24a service nova] Lock "f5496c5f-292e-4912-991b-f834009e51a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:19:47 user nova-compute[70975]: DEBUG nova.compute.manager [req-2188f308-e36d-4709-9fa7-6f5e31a11ceb req-e033138a-a914-496e-92ed-01349fa2d24a service nova] [instance: f5496c5f-292e-4912-991b-f834009e51a1] No waiting events found dispatching network-vif-plugged-4d7beeed-1a0b-490b-a788-3b8442f86758 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:19:47 user nova-compute[70975]: WARNING nova.compute.manager [req-2188f308-e36d-4709-9fa7-6f5e31a11ceb req-e033138a-a914-496e-92ed-01349fa2d24a service nova] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Received unexpected event network-vif-plugged-4d7beeed-1a0b-490b-a788-3b8442f86758 for instance with vm_state building and task_state spawning. Apr 18 16:19:47 user nova-compute[70975]: INFO nova.compute.manager [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Took 6.92 seconds to build instance. 
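[Editor's readability aid] The placement inventory that the resource tracker reports unchanged for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 in the records below is easier to scan as a Python literal; the values are copied from the log itself:

    # Provider inventory reported in the records below, copied from the log.
    inventory = {
        "VCPU": {"total": 12, "reserved": 0, "min_unit": 1, "max_unit": 12,
                 "step_size": 1, "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 16023, "reserved": 512, "min_unit": 1,
                      "max_unit": 16023, "step_size": 1, "allocation_ratio": 1.0},
        "DISK_GB": {"total": 40, "reserved": 0, "min_unit": 1, "max_unit": 40,
                    "step_size": 1, "allocation_ratio": 1.0},
    }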
Apr 18 16:19:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-8bda2dd2-aa55-4d95-a808-696766ac749c tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Lock "f5496c5f-292e-4912-991b-f834009e51a1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.025s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:19:47 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Updating instance_info_cache with network_info: [] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:19:47 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Took 0.61 seconds to deallocate network for instance. Apr 18 16:19:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-9d5ca017-8acf-477c-9870-db4667cd991f tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:19:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-9d5ca017-8acf-477c-9870-db4667cd991f tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:19:47 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-9d5ca017-8acf-477c-9870-db4667cd991f tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:19:47 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-9d5ca017-8acf-477c-9870-db4667cd991f tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:19:47 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-9d5ca017-8acf-477c-9870-db4667cd991f tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.193s {{(pid=70975) 
inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:19:47 user nova-compute[70975]: INFO nova.scheduler.client.report [None req-9d5ca017-8acf-477c-9870-db4667cd991f tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Deleted allocations for instance da82d905-1ca1-403d-9598-7561e69b9704 Apr 18 16:19:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-9d5ca017-8acf-477c-9870-db4667cd991f tempest-ServerRescueNegativeTestJSON-1586888284 tempest-ServerRescueNegativeTestJSON-1586888284-project-member] Lock "da82d905-1ca1-403d-9598-7561e69b9704" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.873s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:19:48 user nova-compute[70975]: DEBUG nova.compute.manager [req-20500898-d6f8-4bda-956b-817e0278f19f req-d4df56e8-4f5c-4d21-ba82-a2437934af2e service nova] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Received event network-vif-deleted-894e80db-f051-4b32-adc8-e3afa321eb34 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:19:51 user nova-compute[70975]: DEBUG nova.compute.manager [None req-3db2d662-4f4d-4e06-9a1c-20a769ee95b7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:19:51 user nova-compute[70975]: INFO nova.compute.manager [None req-3db2d662-4f4d-4e06-9a1c-20a769ee95b7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] instance snapshotting Apr 18 16:19:51 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-3db2d662-4f4d-4e06-9a1c-20a769ee95b7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Beginning live snapshot process Apr 18 16:19:51 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-3db2d662-4f4d-4e06-9a1c-20a769ee95b7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk --force-share --output=json -f qcow2 {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:19:51 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-3db2d662-4f4d-4e06-9a1c-20a769ee95b7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk --force-share --output=json -f qcow2" returned: 0 in 0.145s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:19:51 user nova-compute[70975]: DEBUG oslo_concurrency.processutils 
[None req-3db2d662-4f4d-4e06-9a1c-20a769ee95b7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk --force-share --output=json -f qcow2 {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:19:51 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:51 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-3db2d662-4f4d-4e06-9a1c-20a769ee95b7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk --force-share --output=json -f qcow2" returned: 0 in 0.151s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:19:51 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-3db2d662-4f4d-4e06-9a1c-20a769ee95b7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:19:51 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-3db2d662-4f4d-4e06-9a1c-20a769ee95b7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.136s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:19:51 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-3db2d662-4f4d-4e06-9a1c-20a769ee95b7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmpw15mrm2k/40a59c392fe54484aed3ff9900a1b777.delta 1073741824 {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:19:51 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-3db2d662-4f4d-4e06-9a1c-20a769ee95b7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o 
backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmpw15mrm2k/40a59c392fe54484aed3ff9900a1b777.delta 1073741824" returned: 0 in 0.048s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:19:51 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-3db2d662-4f4d-4e06-9a1c-20a769ee95b7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Quiescing instance not available: QEMU guest agent is not enabled. Apr 18 16:19:52 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:52 user nova-compute[70975]: DEBUG nova.virt.libvirt.guest [None req-3db2d662-4f4d-4e06-9a1c-20a769ee95b7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] COPY block job progress, current cursor: 0 final cursor: 43778048 {{(pid=70975) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 18 16:19:53 user nova-compute[70975]: DEBUG nova.virt.libvirt.guest [None req-3db2d662-4f4d-4e06-9a1c-20a769ee95b7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] COPY block job progress, current cursor: 43778048 final cursor: 43778048 {{(pid=70975) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 18 16:19:53 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-3db2d662-4f4d-4e06-9a1c-20a769ee95b7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Skipping quiescing instance: QEMU guest agent is not enabled. 
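[Editor's illustrative sketch] The "COPY block job progress, current cursor ... final cursor" records come from polling the block-copy job until the two cursors match. A hedged sketch of that polling loop against the libvirt-python API — the domain handle and the 'vda' disk name are illustrative assumptions, not values from the log:

    # Hedged sketch of polling a libvirt block-copy job until its cursor
    # reaches the final cursor, as in the "COPY block job progress" records.
    # 'dom' is an already-looked-up libvirt.virDomain; 'vda' is illustrative.
    import time

    def wait_for_block_job(dom, disk="vda", poll=0.5):
        while True:
            info = dom.blockJobInfo(disk, 0)
            # An empty dict means no job is active; otherwise compare cursors.
            if not info or info.get("cur") == info.get("end"):
                return
            time.sleep(poll)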
Apr 18 16:19:53 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:53 user nova-compute[70975]: DEBUG nova.privsep.utils [None req-3db2d662-4f4d-4e06-9a1c-20a769ee95b7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=70975) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 18 16:19:53 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-3db2d662-4f4d-4e06-9a1c-20a769ee95b7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmpw15mrm2k/40a59c392fe54484aed3ff9900a1b777.delta /opt/stack/data/nova/instances/snapshots/tmpw15mrm2k/40a59c392fe54484aed3ff9900a1b777 {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:19:53 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-3db2d662-4f4d-4e06-9a1c-20a769ee95b7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmpw15mrm2k/40a59c392fe54484aed3ff9900a1b777.delta /opt/stack/data/nova/instances/snapshots/tmpw15mrm2k/40a59c392fe54484aed3ff9900a1b777" returned: 0 in 0.347s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:19:53 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-3db2d662-4f4d-4e06-9a1c-20a769ee95b7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Snapshot extracted, beginning image upload Apr 18 16:19:55 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:55 user nova-compute[70975]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:19:55 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] VM Stopped (Lifecycle Event) Apr 18 16:19:55 user nova-compute[70975]: DEBUG nova.compute.manager [None req-3c73ce3c-5288-4a99-8829-30dbef94cb8d None None] [instance: b71bd3c1-da58-4cb0-abc3-650e11b9d4ce] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:19:56 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-3db2d662-4f4d-4e06-9a1c-20a769ee95b7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Snapshot image upload complete Apr 18 16:19:56 user nova-compute[70975]: INFO nova.compute.manager [None req-3db2d662-4f4d-4e06-9a1c-20a769ee95b7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Took 5.22 seconds to snapshot the instance on the hypervisor. 
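[Editor's illustrative sketch] Taken together, the snapshot records above reduce to two qemu-img invocations: create a qcow2 delta backed by the cached base image, then flatten it with "qemu-img convert -t none" before upload. A hedged, stand-alone sketch mirroring those exact commands (the output paths are shortened placeholders, not the temporary directory from the log):

    # Hedged sketch of the two qemu-img steps logged above: a backing-file
    # delta followed by a convert. Paths are illustrative placeholders.
    import subprocess

    BASE = "/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053"
    DELTA = "/tmp/snapshot/40a59c392fe54484aed3ff9900a1b777.delta"
    OUT = "/tmp/snapshot/40a59c392fe54484aed3ff9900a1b777"

    subprocess.run(
        ["qemu-img", "create", "-f", "qcow2",
         "-o", f"backing_file={BASE},backing_fmt=raw", DELTA, "1073741824"],
        check=True)
    subprocess.run(
        ["qemu-img", "convert", "-t", "none", "-O", "qcow2", "-f", "qcow2",
         DELTA, OUT],
        check=True)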
Apr 18 16:19:56 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:57 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:19:57 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:19:57 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Cleaning up deleted instances {{(pid=70975) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 18 16:19:57 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] There are 0 instances to clean {{(pid=70975) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 18 16:19:57 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:19:58 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Acquiring lock "f6d6085d-9e15-4e29-ab90-3a8928971324" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:19:58 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "f6d6085d-9e15-4e29-ab90-3a8928971324" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:19:58 user nova-compute[70975]: DEBUG nova.compute.manager [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Starting instance... 
{{(pid=70975) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 18 16:19:58 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:19:58 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:19:58 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=70975) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 18 16:19:58 user nova-compute[70975]: INFO nova.compute.claims [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Claim successful on node user Apr 18 16:19:58 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:19:58 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:19:58 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.299s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:19:58 user nova-compute[70975]: DEBUG nova.compute.manager [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Start building networks asynchronously for instance. 
{{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 18 16:19:58 user nova-compute[70975]: DEBUG nova.compute.manager [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Allocating IP information in the background. {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 18 16:19:58 user nova-compute[70975]: DEBUG nova.network.neutron [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] allocate_for_instance() {{(pid=70975) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 18 16:19:58 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 18 16:19:58 user nova-compute[70975]: DEBUG nova.compute.manager [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Start building block device mappings for instance. {{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 18 16:19:59 user nova-compute[70975]: DEBUG nova.compute.manager [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Start spawning the instance on the hypervisor. 
{{(pid=70975) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 18 16:19:59 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Creating instance directory {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 18 16:19:59 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Creating image(s) Apr 18 16:19:59 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Acquiring lock "/opt/stack/data/nova/instances/f6d6085d-9e15-4e29-ab90-3a8928971324/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:19:59 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "/opt/stack/data/nova/instances/f6d6085d-9e15-4e29-ab90-3a8928971324/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:19:59 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "/opt/stack/data/nova/instances/f6d6085d-9e15-4e29-ab90-3a8928971324/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:19:59 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:19:59 user nova-compute[70975]: DEBUG nova.policy [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'af90e17ec027463fa8793e8539c39e13', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6b4e8d8797be4c0e91b1401538f2eba8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 
'service_roles': []} {{(pid=70975) authorize /opt/stack/nova/nova/policy.py:203}} Apr 18 16:19:59 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.127s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:19:59 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Acquiring lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:19:59 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:19:59 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:19:59 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.139s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:19:59 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/f6d6085d-9e15-4e29-ab90-3a8928971324/disk 1073741824 {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:19:59 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] CMD "env 
LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/f6d6085d-9e15-4e29-ab90-3a8928971324/disk 1073741824" returned: 0 in 0.051s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:19:59 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.197s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:19:59 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:19:59 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:19:59 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=70975) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 18 16:19:59 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.138s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:19:59 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Checking if we can resize image /opt/stack/data/nova/instances/f6d6085d-9e15-4e29-ab90-3a8928971324/disk. 
size=1073741824 {{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 18 16:19:59 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f6d6085d-9e15-4e29-ab90-3a8928971324/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:19:59 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f6d6085d-9e15-4e29-ab90-3a8928971324/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:19:59 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Cannot resize image /opt/stack/data/nova/instances/f6d6085d-9e15-4e29-ab90-3a8928971324/disk to a smaller size. {{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 18 16:19:59 user nova-compute[70975]: DEBUG nova.objects.instance [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lazy-loading 'migration_context' on Instance uuid f6d6085d-9e15-4e29-ab90-3a8928971324 {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:19:59 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Created local disks {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 18 16:19:59 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Ensure instance console log exists: /opt/stack/data/nova/instances/f6d6085d-9e15-4e29-ab90-3a8928971324/console.log {{(pid=70975) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 18 16:19:59 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:19:59 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "vgpu_resources" acquired by 
"nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:19:59 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:20:00 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:20:00 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:20:00 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:20:00 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Cleaning up deleted instances with incomplete migration {{(pid=70975) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 18 16:20:00 user nova-compute[70975]: DEBUG nova.network.neutron [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Successfully created port: fb818849-31a0-4c25-b42d-ca19fe250ca6 {{(pid=70975) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 18 16:20:01 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:01 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:20:01 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Starting heal instance info cache {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 18 16:20:01 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Rebuilding the list of instances to heal {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 18 16:20:01 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Skipping network cache update for instance because it is Building. 
{{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9805}} Apr 18 16:20:01 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "refresh_cache-1b530349-680e-4def-86ef-29c340efa175" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:20:01 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquired lock "refresh_cache-1b530349-680e-4def-86ef-29c340efa175" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:20:01 user nova-compute[70975]: DEBUG nova.network.neutron [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] [instance: 1b530349-680e-4def-86ef-29c340efa175] Forcefully refreshing network info cache for instance {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 18 16:20:01 user nova-compute[70975]: DEBUG nova.objects.instance [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lazy-loading 'info_cache' on Instance uuid 1b530349-680e-4def-86ef-29c340efa175 {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:20:01 user nova-compute[70975]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:20:01 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: da82d905-1ca1-403d-9598-7561e69b9704] VM Stopped (Lifecycle Event) Apr 18 16:20:01 user nova-compute[70975]: DEBUG nova.compute.manager [None req-c532ac10-219a-4225-94ea-468bfe59ef43 None None] [instance: da82d905-1ca1-403d-9598-7561e69b9704] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:20:01 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:02 user nova-compute[70975]: DEBUG nova.network.neutron [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] [instance: 1b530349-680e-4def-86ef-29c340efa175] Updating instance_info_cache with network_info: [{"id": "64d26c20-add4-4a63-bace-6a3678032692", "address": "fa:16:3e:33:ec:46", "network": {"id": "f5beaf4a-eeaf-454b-bde5-dd5e1f15e9dd", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-215585786-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "caa61b19cc4e4cd4bb7d41291c40ef1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap64d26c20-ad", "ovs_interfaceid": "64d26c20-add4-4a63-bace-6a3678032692", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:20:02 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c 
None None] Releasing lock "refresh_cache-1b530349-680e-4def-86ef-29c340efa175" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:20:02 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] [instance: 1b530349-680e-4def-86ef-29c340efa175] Updated the network info_cache for instance {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 18 16:20:02 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:20:02 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager.update_available_resource {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:20:02 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:02 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:20:02 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:20:02 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:20:02 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Auditing locally available compute resources for user (node: user) {{(pid=70975) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 18 16:20:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:20:02 user nova-compute[70975]: DEBUG nova.network.neutron [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Successfully updated port: fb818849-31a0-4c25-b42d-ca19fe250ca6 {{(pid=70975) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 18 16:20:02 user nova-compute[70975]: DEBUG 
oslo_concurrency.lockutils [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Acquiring lock "refresh_cache-f6d6085d-9e15-4e29-ab90-3a8928971324" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:20:02 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Acquired lock "refresh_cache-f6d6085d-9e15-4e29-ab90-3a8928971324" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:20:02 user nova-compute[70975]: DEBUG nova.network.neutron [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Building network info cache for instance {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 18 16:20:02 user nova-compute[70975]: DEBUG nova.compute.manager [req-e4b8e942-ddf5-47f5-a496-9dd3deae5bf3 req-8cf5a286-1a9d-4581-801a-f8d73e412c80 service nova] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Received event network-changed-fb818849-31a0-4c25-b42d-ca19fe250ca6 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:20:02 user nova-compute[70975]: DEBUG nova.compute.manager [req-e4b8e942-ddf5-47f5-a496-9dd3deae5bf3 req-8cf5a286-1a9d-4581-801a-f8d73e412c80 service nova] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Refreshing instance network info cache due to event network-changed-fb818849-31a0-4c25-b42d-ca19fe250ca6. {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 18 16:20:02 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-e4b8e942-ddf5-47f5-a496-9dd3deae5bf3 req-8cf5a286-1a9d-4581-801a-f8d73e412c80 service nova] Acquiring lock "refresh_cache-f6d6085d-9e15-4e29-ab90-3a8928971324" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:20:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:20:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:20:02 user nova-compute[70975]: DEBUG nova.network.neutron [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Instance cache missing network info. 
{{(pid=70975) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 18 16:20:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:20:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:20:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:20:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:20:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk --force-share --output=json" returned: 0 in 0.127s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:20:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f5496c5f-292e-4912-991b-f834009e51a1/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:20:03 user nova-compute[70975]: DEBUG nova.network.neutron [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Updating instance_info_cache with network_info: [{"id": "fb818849-31a0-4c25-b42d-ca19fe250ca6", "address": "fa:16:3e:54:c5:ae", "network": {"id": "02aca424-2923-404b-9c66-76bec89f82b7", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1255258227-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": 
"10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6b4e8d8797be4c0e91b1401538f2eba8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb818849-31", "ovs_interfaceid": "fb818849-31a0-4c25-b42d-ca19fe250ca6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:20:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f5496c5f-292e-4912-991b-f834009e51a1/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:20:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f5496c5f-292e-4912-991b-f834009e51a1/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:20:03 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Releasing lock "refresh_cache-f6d6085d-9e15-4e29-ab90-3a8928971324" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:20:03 user nova-compute[70975]: DEBUG nova.compute.manager [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Instance network_info: |[{"id": "fb818849-31a0-4c25-b42d-ca19fe250ca6", "address": "fa:16:3e:54:c5:ae", "network": {"id": "02aca424-2923-404b-9c66-76bec89f82b7", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1255258227-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6b4e8d8797be4c0e91b1401538f2eba8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb818849-31", "ovs_interfaceid": "fb818849-31a0-4c25-b42d-ca19fe250ca6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 18 16:20:03 user 
nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-e4b8e942-ddf5-47f5-a496-9dd3deae5bf3 req-8cf5a286-1a9d-4581-801a-f8d73e412c80 service nova] Acquired lock "refresh_cache-f6d6085d-9e15-4e29-ab90-3a8928971324" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:20:03 user nova-compute[70975]: DEBUG nova.network.neutron [req-e4b8e942-ddf5-47f5-a496-9dd3deae5bf3 req-8cf5a286-1a9d-4581-801a-f8d73e412c80 service nova] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Refreshing network info cache for port fb818849-31a0-4c25-b42d-ca19fe250ca6 {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 18 16:20:03 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Start _get_guest_xml network_info=[{"id": "fb818849-31a0-4c25-b42d-ca19fe250ca6", "address": "fa:16:3e:54:c5:ae", "network": {"id": "02aca424-2923-404b-9c66-76bec89f82b7", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1255258227-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6b4e8d8797be4c0e91b1401538f2eba8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb818849-31", "ovs_interfaceid": "fb818849-31a0-4c25-b42d-ca19fe250ca6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encrypted': False, 'device_type': 'disk', 'encryption_secret_uuid': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'b11a20de-f82a-4158-b53e-0a0c7a1552cb'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 18 16:20:03 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
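Earlier in this trace the root disk is built as a qcow2 overlay on a cached base image and then probed with `qemu-img info --force-share --output=json` (run under `oslo_concurrency.prlimit`, which the log shows capping address space and CPU time) before the "can we resize" check. As a rough illustration only, not Nova code, the sketch below reproduces those two qemu-img operations with the standard library; the paths and helper names are placeholders and the prlimit wrapper is omitted.

```python
#!/usr/bin/env python3
"""Rough sketch (not Nova code) of the two qemu-img operations in the log:
create a qcow2 overlay on top of a cached base image, then probe it with
`qemu-img info --force-share --output=json` the way the resize check does.
Paths are placeholders; the oslo prlimit wrapper seen in the log is omitted."""
import json
import os
import subprocess

# Hypothetical paths, standing in for the base-image and instance paths in the log.
BASE = "/opt/stack/data/nova/instances/_base/<cached-base-image>"
OVERLAY = "/opt/stack/data/nova/instances/<instance-uuid>/disk"

ENV = {**os.environ, "LC_ALL": "C", "LANG": "C"}  # same locale pinning as the logged commands


def create_overlay(size_bytes: int) -> None:
    # Mirrors: qemu-img create -f qcow2 -o backing_file=<base>,backing_fmt=raw <disk> <size>
    subprocess.run(
        ["qemu-img", "create", "-f", "qcow2",
         "-o", f"backing_file={BASE},backing_fmt=raw",
         OVERLAY, str(size_bytes)],
        check=True, env=ENV,
    )


def virtual_size(path: str) -> int:
    # Mirrors: qemu-img info <path> --force-share --output=json
    out = subprocess.run(
        ["qemu-img", "info", path, "--force-share", "--output=json"],
        check=True, capture_output=True, env=ENV,
    ).stdout
    return json.loads(out)["virtual-size"]


if __name__ == "__main__":
    create_overlay(1073741824)  # 1 GiB root disk, as in the logged command
    # The resize check refuses to shrink a disk; the log records
    # "Cannot resize image ... to a smaller size." and skips the resize.
    assert virtual_size(OVERLAY) >= 1073741824
```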
Apr 18 16:20:03 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:20:03 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70975) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 18 16:20:03 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-18T16:11:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=), allow threads: True {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 18 16:20:03 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Flavor limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 18 16:20:03 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Image limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 18 16:20:03 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Flavor pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 18 16:20:03 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Image pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 18 16:20:03 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 18 16:20:03 user nova-compute[70975]: DEBUG nova.virt.hardware 
[None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 18 16:20:03 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 18 16:20:03 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Got 1 possible topologies {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 18 16:20:03 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 18 16:20:03 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 18 16:20:03 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:19:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-2112657845',display_name='tempest-AttachVolumeNegativeTest-server-2112657845',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-2112657845',id=16,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHYRXU/ibSPY+lfyweoe12uOrvfmUvG6DlTq9LgRSH5Mu+rZpmKAfw8UVQNbDlibCQU69kF6sfr+Z42hzsCh/sT3mzfLZiHHLTZ94at32kiiHcYOGoL6apTKhxzZMUuP2A==',key_name='tempest-keypair-901960884',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6b4e8d8797be4c0e91b1401538f2eba8',ramdisk_id='',reservation_id='r-bu0v0mk3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-216357456',owner_user_name='tempest-AttachVolumeNegativeTest-216357456-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:19:59Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='af90e17ec027463fa8793e8539c39e13',uuid=f6d6085d-9e15-4e29-ab90-3a8928971324,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fb818849-31a0-4c25-b42d-ca19fe250ca6", "address": "fa:16:3e:54:c5:ae", "network": {"id": "02aca424-2923-404b-9c66-76bec89f82b7", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1255258227-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6b4e8d8797be4c0e91b1401538f2eba8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb818849-31", "ovs_interfaceid": "fb818849-31a0-4c25-b42d-ca19fe250ca6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70975) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 18 16:20:03 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Converting VIF {"id": "fb818849-31a0-4c25-b42d-ca19fe250ca6", "address": "fa:16:3e:54:c5:ae", "network": {"id": "02aca424-2923-404b-9c66-76bec89f82b7", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1255258227-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "6b4e8d8797be4c0e91b1401538f2eba8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb818849-31", "ovs_interfaceid": "fb818849-31a0-4c25-b42d-ca19fe250ca6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:20:03 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:54:c5:ae,bridge_name='br-int',has_traffic_filtering=True,id=fb818849-31a0-4c25-b42d-ca19fe250ca6,network=Network(02aca424-2923-404b-9c66-76bec89f82b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb818849-31') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:20:03 user nova-compute[70975]: DEBUG nova.objects.instance [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lazy-loading 'pci_devices' on Instance uuid f6d6085d-9e15-4e29-ab90-3a8928971324 {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:20:03 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] End _get_guest_xml xml= Apr 18 16:20:03 user nova-compute[70975]: f6d6085d-9e15-4e29-ab90-3a8928971324 Apr 18 16:20:03 user nova-compute[70975]: instance-00000010 Apr 18 16:20:03 user nova-compute[70975]: 131072 Apr 18 16:20:03 user nova-compute[70975]: 1 Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: tempest-AttachVolumeNegativeTest-server-2112657845 Apr 18 16:20:03 user nova-compute[70975]: 2023-04-18 16:20:03 Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: 128 Apr 18 16:20:03 user nova-compute[70975]: 1 Apr 18 16:20:03 user nova-compute[70975]: 0 Apr 18 16:20:03 user nova-compute[70975]: 0 Apr 18 16:20:03 user nova-compute[70975]: 1 Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: tempest-AttachVolumeNegativeTest-216357456-project-member Apr 18 16:20:03 user nova-compute[70975]: tempest-AttachVolumeNegativeTest-216357456 Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: OpenStack Foundation Apr 18 16:20:03 user nova-compute[70975]: OpenStack Nova Apr 18 16:20:03 user nova-compute[70975]: 0.0.0 Apr 18 16:20:03 user 
nova-compute[70975]: f6d6085d-9e15-4e29-ab90-3a8928971324 Apr 18 16:20:03 user nova-compute[70975]: f6d6085d-9e15-4e29-ab90-3a8928971324 Apr 18 16:20:03 user nova-compute[70975]: Virtual Machine Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: hvm Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Nehalem Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: /dev/urandom Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: Apr 18 16:20:03 user nova-compute[70975]: {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 18 16:20:03 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:19:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-2112657845',display_name='tempest-AttachVolumeNegativeTest-server-2112657845',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-2112657845',id=16,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHYRXU/ibSPY+lfyweoe12uOrvfmUvG6DlTq9LgRSH5Mu+rZpmKAfw8UVQNbDlibCQU69kF6sfr+Z42hzsCh/sT3mzfLZiHHLTZ94at32kiiHcYOGoL6apTKhxzZMUuP2A==',key_name='tempest-keypair-901960884',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6b4e8d8797be4c0e91b1401538f2eba8',ramdisk_id='',reservation_id='r-bu0v0mk3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-216357456',owner_user_name='tempest-AttachVolumeNegativeTest-216357456-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:19:59Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='af90e17ec027463fa8793e8539c39e13',uuid=f6d6085d-9e15-4e29-ab90-3a8928971324,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fb818849-31a0-4c25-b42d-ca19fe250ca6", "address": "fa:16:3e:54:c5:ae", "network": {"id": "02aca424-2923-404b-9c66-76bec89f82b7", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1255258227-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6b4e8d8797be4c0e91b1401538f2eba8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb818849-31", "ovs_interfaceid": "fb818849-31a0-4c25-b42d-ca19fe250ca6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 18 16:20:03 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Converting VIF {"id": "fb818849-31a0-4c25-b42d-ca19fe250ca6", "address": "fa:16:3e:54:c5:ae", "network": {"id": "02aca424-2923-404b-9c66-76bec89f82b7", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1255258227-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": 
false, "tenant_id": "6b4e8d8797be4c0e91b1401538f2eba8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb818849-31", "ovs_interfaceid": "fb818849-31a0-4c25-b42d-ca19fe250ca6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:20:03 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:54:c5:ae,bridge_name='br-int',has_traffic_filtering=True,id=fb818849-31a0-4c25-b42d-ca19fe250ca6,network=Network(02aca424-2923-404b-9c66-76bec89f82b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb818849-31') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:20:03 user nova-compute[70975]: DEBUG os_vif [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:54:c5:ae,bridge_name='br-int',has_traffic_filtering=True,id=fb818849-31a0-4c25-b42d-ca19fe250ca6,network=Network(02aca424-2923-404b-9c66-76bec89f82b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb818849-31') {{(pid=70975) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 18 16:20:03 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:03 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:20:03 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 18 16:20:03 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:03 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfb818849-31, may_exist=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:20:03 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfb818849-31, col_values=(('external_ids', {'iface-id': 'fb818849-31a0-4c25-b42d-ca19fe250ca6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:54:c5:ae', 'vm-uuid': 'f6d6085d-9e15-4e29-ab90-3a8928971324'}),)) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:20:03 user nova-compute[70975]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:20:03 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:03 user nova-compute[70975]: INFO os_vif [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:54:c5:ae,bridge_name='br-int',has_traffic_filtering=True,id=fb818849-31a0-4c25-b42d-ca19fe250ca6,network=Network(02aca424-2923-404b-9c66-76bec89f82b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb818849-31') Apr 18 16:20:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f5496c5f-292e-4912-991b-f834009e51a1/disk --force-share --output=json" returned: 0 in 0.163s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:20:03 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:20:03 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
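The plug sequence above runs three OVSDB operations through ovsdbapp: AddBridgeCommand for br-int (already present, hence "Transaction caused no change"), AddPortCommand for tapfb818849-31, and a DbSetCommand that stamps the Neutron port UUID, MAC and instance UUID into the Interface's external_ids. As a rough equivalent only, and not what os-vif actually executes, the sketch below performs the same three steps with the ovs-vsctl CLI; the values are copied from the logged commands and the calls require access to the OVSDB socket (typically root).

```python
#!/usr/bin/env python3
"""Rough equivalent (not os-vif code) of the logged OVS plug sequence,
using the ovs-vsctl CLI instead of ovsdbapp IDL transactions:
ensure br-int exists, add the tap device, set the Neutron external_ids."""
import subprocess

BRIDGE = "br-int"
TAP = "tapfb818849-31"
EXTERNAL_IDS = {  # values taken from the DbSetCommand in the log
    "iface-id": "fb818849-31a0-4c25-b42d-ca19fe250ca6",   # Neutron port UUID
    "iface-status": "active",
    "attached-mac": "fa:16:3e:54:c5:ae",
    "vm-uuid": "f6d6085d-9e15-4e29-ab90-3a8928971324",
}


def vsctl(*args: str) -> None:
    # Thin wrapper around the ovs-vsctl CLI; raises on a non-zero exit code.
    subprocess.run(["ovs-vsctl", *args], check=True)


# AddBridgeCommand(name=br-int, may_exist=True, datapath_type=system):
# harmless when the bridge already exists, which is why the log shows
# "Transaction caused no change".
vsctl("--may-exist", "add-br", BRIDGE,
      "--", "set", "Bridge", BRIDGE, "datapath_type=system")

# AddPortCommand(bridge=br-int, port=tapfb818849-31, may_exist=True)
vsctl("--may-exist", "add-port", BRIDGE, TAP)

# DbSetCommand(table=Interface, record=tapfb818849-31,
#              col_values=(('external_ids', {...}),))
vsctl("set", "Interface", TAP,
      *[f'external_ids:{k}="{v}"' for k, v in EXTERNAL_IDS.items()])
```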
Apr 18 16:20:03 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Hypervisor/Node resource view: name=user free_ram=8599MB free_disk=26.62527847290039GB free_vcpus=9 pci_devices=[{"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", 
"numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70975) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 18 16:20:03 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:20:03 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:20:03 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] No BDM found with device name vda, not building metadata. {{(pid=70975) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 18 16:20:03 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] No VIF found with MAC fa:16:3e:54:c5:ae, not building metadata {{(pid=70975) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 18 16:20:04 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:04 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 1b530349-680e-4def-86ef-29c340efa175 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:20:04 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 6528f05a-9f05-4f35-b991-687e4f47029e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:20:04 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance f5496c5f-292e-4912-991b-f834009e51a1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:20:04 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance f6d6085d-9e15-4e29-ab90-3a8928971324 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:20:04 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Total usable vcpus: 12, total allocated vcpus: 4 {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 18 16:20:04 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Final resource view: name=user phys_ram=16023MB used_ram=1024MB phys_disk=40GB used_disk=4GB total_vcpus=12 used_vcpus=4 pci_stats=[] {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 18 16:20:04 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Refreshing inventories for resource provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 18 16:20:04 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Updating ProviderTree inventory for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 18 16:20:04 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Updating inventory in ProviderTree for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 with inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 18 16:20:04 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Refreshing aggregate associations for resource provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9, aggregates: None {{(pid=70975) 
_refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 18 16:20:04 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Refreshing trait associations for resource provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9, traits: COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE41,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_RESCUE_BFV,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_IDE,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE42 {{(pid=70975) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 18 16:20:04 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:20:04 user nova-compute[70975]: DEBUG nova.network.neutron [req-e4b8e942-ddf5-47f5-a496-9dd3deae5bf3 req-8cf5a286-1a9d-4581-801a-f8d73e412c80 service nova] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Updated VIF entry in instance network info cache for port fb818849-31a0-4c25-b42d-ca19fe250ca6. 
{{(pid=70975) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 18 16:20:04 user nova-compute[70975]: DEBUG nova.network.neutron [req-e4b8e942-ddf5-47f5-a496-9dd3deae5bf3 req-8cf5a286-1a9d-4581-801a-f8d73e412c80 service nova] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Updating instance_info_cache with network_info: [{"id": "fb818849-31a0-4c25-b42d-ca19fe250ca6", "address": "fa:16:3e:54:c5:ae", "network": {"id": "02aca424-2923-404b-9c66-76bec89f82b7", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1255258227-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6b4e8d8797be4c0e91b1401538f2eba8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb818849-31", "ovs_interfaceid": "fb818849-31a0-4c25-b42d-ca19fe250ca6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:20:04 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:20:04 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-e4b8e942-ddf5-47f5-a496-9dd3deae5bf3 req-8cf5a286-1a9d-4581-801a-f8d73e412c80 service nova] Releasing lock "refresh_cache-f6d6085d-9e15-4e29-ab90-3a8928971324" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:20:04 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Compute_service record updated for user:user {{(pid=70975) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 18 16:20:04 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.603s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:20:05 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:05 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:05 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
[POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:05 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:20:05 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:20:05 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:20:05 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:20:05 user nova-compute[70975]: DEBUG nova.compute.manager [req-4b230ccc-ec38-4621-a32c-90f199d9cf9a req-7bef9ee3-547c-47ce-9117-7fa348a6320b service nova] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Received event network-vif-plugged-fb818849-31a0-4c25-b42d-ca19fe250ca6 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:20:05 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-4b230ccc-ec38-4621-a32c-90f199d9cf9a req-7bef9ee3-547c-47ce-9117-7fa348a6320b service nova] Acquiring lock "f6d6085d-9e15-4e29-ab90-3a8928971324-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:20:05 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-4b230ccc-ec38-4621-a32c-90f199d9cf9a req-7bef9ee3-547c-47ce-9117-7fa348a6320b service nova] Lock "f6d6085d-9e15-4e29-ab90-3a8928971324-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:20:05 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-4b230ccc-ec38-4621-a32c-90f199d9cf9a req-7bef9ee3-547c-47ce-9117-7fa348a6320b service nova] Lock "f6d6085d-9e15-4e29-ab90-3a8928971324-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:20:05 user nova-compute[70975]: DEBUG nova.compute.manager [req-4b230ccc-ec38-4621-a32c-90f199d9cf9a req-7bef9ee3-547c-47ce-9117-7fa348a6320b service nova] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] No waiting events found dispatching network-vif-plugged-fb818849-31a0-4c25-b42d-ca19fe250ca6 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:20:05 user nova-compute[70975]: WARNING nova.compute.manager [req-4b230ccc-ec38-4621-a32c-90f199d9cf9a req-7bef9ee3-547c-47ce-9117-7fa348a6320b service nova] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] 
Received unexpected event network-vif-plugged-fb818849-31a0-4c25-b42d-ca19fe250ca6 for instance with vm_state building and task_state spawning. Apr 18 16:20:05 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:05 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:05 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:05 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:06 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Acquiring lock "5f4e6f9b-5413-4399-83ca-9bc78911db38" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:20:06 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Lock "5f4e6f9b-5413-4399-83ca-9bc78911db38" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:20:06 user nova-compute[70975]: DEBUG nova.compute.manager [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Starting instance... {{(pid=70975) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 18 16:20:07 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:20:07 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:20:07 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=70975) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 18 16:20:07 user nova-compute[70975]: INFO nova.compute.claims [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Claim successful on node user Apr 18 16:20:07 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:20:07 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:20:07 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.287s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:20:07 user nova-compute[70975]: DEBUG nova.compute.manager [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Start building networks asynchronously for instance. {{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 18 16:20:07 user nova-compute[70975]: DEBUG nova.compute.manager [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Allocating IP information in the background. {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 18 16:20:07 user nova-compute[70975]: DEBUG nova.network.neutron [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] allocate_for_instance() {{(pid=70975) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 18 16:20:07 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 18 16:20:07 user nova-compute[70975]: DEBUG nova.compute.manager [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Start building block device mappings for instance. {{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 18 16:20:07 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Resumed> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:20:07 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] VM Resumed (Lifecycle Event) Apr 18 16:20:07 user nova-compute[70975]: DEBUG nova.compute.manager [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Instance event wait completed in 0 seconds for {{(pid=70975) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 18 16:20:07 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Guest created on hypervisor {{(pid=70975) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 18 16:20:07 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Instance spawned successfully. 
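The network-vif-plugged warnings in the surrounding entries come from the compute manager's event dispatch path that these lock messages trace: a waiter is registered per instance and event name before the action that triggers the Neutron notification, and an incoming notification either completes that waiter or, when nothing is registered (as while this instance is still in vm_state building / task_state spawning), is dispatched with "No waiting events found" and logged as unexpected. Below is a minimal, self-contained sketch of that waiter/dispatch pattern, for illustration only; the names EventWaiters, expect and dispatch are invented for the example and are not nova's API.

    # Illustrative sketch (not nova code): per-instance event waiters keyed by
    # event name, mirroring the "No waiting events found dispatching ..." and
    # "Received unexpected event ..." entries in this log.
    import threading
    from collections import defaultdict

    class EventWaiters:
        def __init__(self):
            self._lock = threading.Lock()        # plays the role of the "<instance-uuid>-events" lock
            self._waiters = defaultdict(dict)    # {instance_uuid: {event_name: threading.Event}}

        def expect(self, instance_uuid, event_name):
            # Register interest before triggering the action that causes the event.
            ev = threading.Event()
            with self._lock:
                self._waiters[instance_uuid][event_name] = ev
            return ev

        def dispatch(self, instance_uuid, event_name):
            # Pop and complete a registered waiter, or report the event as unexpected.
            with self._lock:
                ev = self._waiters.get(instance_uuid, {}).pop(event_name, None)
            if ev is None:
                print("WARNING: unexpected event %s for %s" % (event_name, instance_uuid))
                return False
            ev.set()
            return True

    waiters = EventWaiters()
    done = waiters.expect("f6d6085d-9e15-4e29-ab90-3a8928971324",
                          "network-vif-plugged-fb818849-31a0-4c25-b42d-ca19fe250ca6")
    waiters.dispatch("f6d6085d-9e15-4e29-ab90-3a8928971324",
                     "network-vif-plugged-fb818849-31a0-4c25-b42d-ca19fe250ca6")
    done.wait(timeout=1)  # returns immediately once dispatched

Read this way, the WARNING entries here are benign: the plug notification simply arrived while no waiter was registered for an instance that was still spawning.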
Apr 18 16:20:07 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 18 16:20:07 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:07 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:20:07 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:20:07 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Found default for hw_cdrom_bus of ide {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:20:07 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Found default for hw_disk_bus of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:20:07 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Found default for hw_input_bus of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:20:07 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Found default for hw_pointer_model of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:20:07 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Found default for hw_video_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:20:07 user nova-compute[70975]: DEBUG 
nova.virt.libvirt.driver [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Found default for hw_vif_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:20:07 user nova-compute[70975]: DEBUG nova.policy [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b01da22d0c8c4ee580567646e279d7b9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '000fb9c948224fe3b595882d36cfb859', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70975) authorize /opt/stack/nova/nova/policy.py:203}} Apr 18 16:20:07 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] During sync_power_state the instance has a pending task (spawning). Skip. Apr 18 16:20:07 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Started> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:20:07 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] VM Started (Lifecycle Event) Apr 18 16:20:07 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:20:07 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:20:07 user nova-compute[70975]: DEBUG nova.compute.manager [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Start spawning the instance on the hypervisor. 
{{(pid=70975) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 18 16:20:07 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Creating instance directory {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 18 16:20:07 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Creating image(s) Apr 18 16:20:07 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Acquiring lock "/opt/stack/data/nova/instances/5f4e6f9b-5413-4399-83ca-9bc78911db38/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:20:07 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Lock "/opt/stack/data/nova/instances/5f4e6f9b-5413-4399-83ca-9bc78911db38/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:20:07 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Lock "/opt/stack/data/nova/instances/5f4e6f9b-5413-4399-83ca-9bc78911db38/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.009s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:20:07 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] During sync_power_state the instance has a pending task (spawning). Skip. Apr 18 16:20:07 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:20:07 user nova-compute[70975]: INFO nova.compute.manager [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Took 8.62 seconds to spawn the instance on the hypervisor. 
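The image-backend activity in the neighbouring entries reduces to two qemu-img invocations that the log records verbatim: an info call on the cached base image (run under python3 -m oslo_concurrency.prlimit to cap the child's address space and CPU time), followed by creation of a 1 GiB qcow2 overlay whose backing file is that base image, serialized under a lock named after the base-image hash. A minimal sketch of the same two steps, assuming qemu-img is available and the paths (taken from these log entries) exist; plain subprocess stands in for nova's prlimit-wrapped executor:

    # Inspect the cached base image, then create a qcow2 overlay backed by it,
    # as the Qcow2 image backend does here for instance 5f4e6f9b-5413-4399-83ca-9bc78911db38.
    import json
    import subprocess

    base = "/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053"
    disk = "/opt/stack/data/nova/instances/5f4e6f9b-5413-4399-83ca-9bc78911db38/disk"

    info = json.loads(subprocess.check_output(
        ["qemu-img", "info", "--force-share", "--output=json", base]))
    print(info["format"], info["virtual-size"])

    # Writes land in `disk`; unmodified blocks are read through from `base` (raw format).
    subprocess.check_call(
        ["qemu-img", "create", "-f", "qcow2",
         "-o", "backing_file=%s,backing_fmt=raw" % base, disk, "1073741824"])

The lock named after the base-image hash ("72bd97915ab7c08468b7f34ddcae11f3f23c8053") appears to serialize this create against other instances sharing the same backing file.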
Apr 18 16:20:07 user nova-compute[70975]: DEBUG nova.compute.manager [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:20:07 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.133s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:20:07 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Acquiring lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:20:07 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:20:07 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:20:07 user nova-compute[70975]: DEBUG nova.compute.manager [req-5f2ff3e4-4e6c-45dc-a6c3-94bd4c561904 req-6e90d571-6bd5-4077-9e08-646433ef41fb service nova] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Received event network-vif-plugged-fb818849-31a0-4c25-b42d-ca19fe250ca6 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:20:07 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-5f2ff3e4-4e6c-45dc-a6c3-94bd4c561904 req-6e90d571-6bd5-4077-9e08-646433ef41fb service nova] Acquiring lock "f6d6085d-9e15-4e29-ab90-3a8928971324-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:20:07 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-5f2ff3e4-4e6c-45dc-a6c3-94bd4c561904 req-6e90d571-6bd5-4077-9e08-646433ef41fb service nova] Lock "f6d6085d-9e15-4e29-ab90-3a8928971324-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70975) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:20:07 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-5f2ff3e4-4e6c-45dc-a6c3-94bd4c561904 req-6e90d571-6bd5-4077-9e08-646433ef41fb service nova] Lock "f6d6085d-9e15-4e29-ab90-3a8928971324-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:20:07 user nova-compute[70975]: DEBUG nova.compute.manager [req-5f2ff3e4-4e6c-45dc-a6c3-94bd4c561904 req-6e90d571-6bd5-4077-9e08-646433ef41fb service nova] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] No waiting events found dispatching network-vif-plugged-fb818849-31a0-4c25-b42d-ca19fe250ca6 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:20:07 user nova-compute[70975]: WARNING nova.compute.manager [req-5f2ff3e4-4e6c-45dc-a6c3-94bd4c561904 req-6e90d571-6bd5-4077-9e08-646433ef41fb service nova] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Received unexpected event network-vif-plugged-fb818849-31a0-4c25-b42d-ca19fe250ca6 for instance with vm_state building and task_state spawning. Apr 18 16:20:07 user nova-compute[70975]: INFO nova.compute.manager [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Took 9.32 seconds to build instance. Apr 18 16:20:07 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-01e86ce5-2b6e-4db5-9dd3-c08b693ca307 tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "f6d6085d-9e15-4e29-ab90-3a8928971324" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 9.432s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:20:07 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.151s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:20:07 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/5f4e6f9b-5413-4399-83ca-9bc78911db38/disk 1073741824 {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:20:07 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o 
backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/5f4e6f9b-5413-4399-83ca-9bc78911db38/disk 1073741824" returned: 0 in 0.048s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:20:07 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.204s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:20:07 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:20:08 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.138s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:20:08 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Checking if we can resize image /opt/stack/data/nova/instances/5f4e6f9b-5413-4399-83ca-9bc78911db38/disk. 
size=1073741824 {{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 18 16:20:08 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5f4e6f9b-5413-4399-83ca-9bc78911db38/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:20:08 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5f4e6f9b-5413-4399-83ca-9bc78911db38/disk --force-share --output=json" returned: 0 in 0.149s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:20:08 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Cannot resize image /opt/stack/data/nova/instances/5f4e6f9b-5413-4399-83ca-9bc78911db38/disk to a smaller size. {{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 18 16:20:08 user nova-compute[70975]: DEBUG nova.objects.instance [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Lazy-loading 'migration_context' on Instance uuid 5f4e6f9b-5413-4399-83ca-9bc78911db38 {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:20:08 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Created local disks {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 18 16:20:08 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Ensure instance console log exists: /opt/stack/data/nova/instances/5f4e6f9b-5413-4399-83ca-9bc78911db38/console.log {{(pid=70975) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 18 16:20:08 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:20:08 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Lock "vgpu_resources" 
acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:20:08 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:20:08 user nova-compute[70975]: DEBUG nova.network.neutron [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Successfully created port: 4330b5f4-c990-4a7d-aa5f-f95315fddf78 {{(pid=70975) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 18 16:20:08 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:08 user nova-compute[70975]: DEBUG nova.network.neutron [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Successfully updated port: 4330b5f4-c990-4a7d-aa5f-f95315fddf78 {{(pid=70975) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 18 16:20:08 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Acquiring lock "refresh_cache-5f4e6f9b-5413-4399-83ca-9bc78911db38" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:20:08 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Acquired lock "refresh_cache-5f4e6f9b-5413-4399-83ca-9bc78911db38" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:20:08 user nova-compute[70975]: DEBUG nova.network.neutron [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Building network info cache for instance {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 18 16:20:08 user nova-compute[70975]: DEBUG nova.compute.manager [req-6cdbf5de-6b5a-4167-8801-73edc86728dc req-257c65fd-02aa-4bab-b67f-eac619dd4e01 service nova] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Received event network-changed-4330b5f4-c990-4a7d-aa5f-f95315fddf78 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:20:08 user nova-compute[70975]: DEBUG nova.compute.manager [req-6cdbf5de-6b5a-4167-8801-73edc86728dc req-257c65fd-02aa-4bab-b67f-eac619dd4e01 service nova] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Refreshing instance network info cache due to event network-changed-4330b5f4-c990-4a7d-aa5f-f95315fddf78. 
{{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 18 16:20:08 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-6cdbf5de-6b5a-4167-8801-73edc86728dc req-257c65fd-02aa-4bab-b67f-eac619dd4e01 service nova] Acquiring lock "refresh_cache-5f4e6f9b-5413-4399-83ca-9bc78911db38" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:20:09 user nova-compute[70975]: DEBUG nova.network.neutron [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Instance cache missing network info. {{(pid=70975) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 18 16:20:09 user nova-compute[70975]: DEBUG nova.network.neutron [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Updating instance_info_cache with network_info: [{"id": "4330b5f4-c990-4a7d-aa5f-f95315fddf78", "address": "fa:16:3e:6f:03:b0", "network": {"id": "c17e54e4-8788-475c-b342-65dddd71342f", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-2093298384-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "000fb9c948224fe3b595882d36cfb859", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4330b5f4-c9", "ovs_interfaceid": "4330b5f4-c990-4a7d-aa5f-f95315fddf78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:20:09 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Releasing lock "refresh_cache-5f4e6f9b-5413-4399-83ca-9bc78911db38" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:20:09 user nova-compute[70975]: DEBUG nova.compute.manager [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Instance network_info: |[{"id": "4330b5f4-c990-4a7d-aa5f-f95315fddf78", "address": "fa:16:3e:6f:03:b0", "network": {"id": "c17e54e4-8788-475c-b342-65dddd71342f", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-2093298384-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "000fb9c948224fe3b595882d36cfb859", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4330b5f4-c9", "ovs_interfaceid": "4330b5f4-c990-4a7d-aa5f-f95315fddf78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 18 16:20:09 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-6cdbf5de-6b5a-4167-8801-73edc86728dc req-257c65fd-02aa-4bab-b67f-eac619dd4e01 service nova] Acquired lock "refresh_cache-5f4e6f9b-5413-4399-83ca-9bc78911db38" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:20:09 user nova-compute[70975]: DEBUG nova.network.neutron [req-6cdbf5de-6b5a-4167-8801-73edc86728dc req-257c65fd-02aa-4bab-b67f-eac619dd4e01 service nova] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Refreshing network info cache for port 4330b5f4-c990-4a7d-aa5f-f95315fddf78 {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 18 16:20:09 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Start _get_guest_xml network_info=[{"id": "4330b5f4-c990-4a7d-aa5f-f95315fddf78", "address": "fa:16:3e:6f:03:b0", "network": {"id": "c17e54e4-8788-475c-b342-65dddd71342f", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-2093298384-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "000fb9c948224fe3b595882d36cfb859", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4330b5f4-c9", "ovs_interfaceid": "4330b5f4-c990-4a7d-aa5f-f95315fddf78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encrypted': False, 'device_type': 'disk', 'encryption_secret_uuid': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'b11a20de-f82a-4158-b53e-0a0c7a1552cb'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70975) 
_get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 18 16:20:09 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:20:09 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:20:09 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70975) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 18 16:20:09 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-18T16:11:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=), allow threads: True {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 18 16:20:09 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Flavor limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 18 16:20:09 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Image limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 18 16:20:09 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Flavor pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 18 16:20:09 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Image pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 18 16:20:09 user 
nova-compute[70975]: DEBUG nova.virt.hardware [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 18 16:20:09 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 18 16:20:09 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 18 16:20:09 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Got 1 possible topologies {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 18 16:20:09 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 18 16:20:09 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 18 16:20:09 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:20:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SnapshotDataIntegrityTests-server-751902854',display_name='tempest-SnapshotDataIntegrityTests-server-751902854',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-snapshotdataintegritytests-server-751902854',id=17,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEjB7c1t/30/n9+kA02kID4xCptJhD3sPRt7RADFNBND5ODgA1T9Tp0ttLsMngsvRYW4lRZy7x7r7VmJz6i3Dd8HWEtRffPcfiU0xC7owaaIepRkCkRUzrPyKKptrcjx0A==',key_name='tempest-SnapshotDataIntegrityTests-898730089',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='000fb9c948224fe3b595882d36cfb859',ramdisk_id='',reservation_id='r-1fmw5k8n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-SnapshotDataIntegrityTests-155672580',owner_user_name='tempest-SnapshotDataIntegrityTests-155672580-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:20:07Z,user_data=None,user_id='b01da22d0c8c4ee580567646e279d7b9',uuid=5f4e6f9b-5413-4399-83ca-9bc78911db38,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4330b5f4-c990-4a7d-aa5f-f95315fddf78", "address": "fa:16:3e:6f:03:b0", "network": {"id": "c17e54e4-8788-475c-b342-65dddd71342f", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-2093298384-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "000fb9c948224fe3b595882d36cfb859", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4330b5f4-c9", "ovs_interfaceid": "4330b5f4-c990-4a7d-aa5f-f95315fddf78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70975) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 18 16:20:09 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Converting VIF {"id": "4330b5f4-c990-4a7d-aa5f-f95315fddf78", "address": "fa:16:3e:6f:03:b0", "network": {"id": "c17e54e4-8788-475c-b342-65dddd71342f", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-2093298384-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "000fb9c948224fe3b595882d36cfb859", "mtu": 1442, "physical_network": 
null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4330b5f4-c9", "ovs_interfaceid": "4330b5f4-c990-4a7d-aa5f-f95315fddf78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:20:09 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6f:03:b0,bridge_name='br-int',has_traffic_filtering=True,id=4330b5f4-c990-4a7d-aa5f-f95315fddf78,network=Network(c17e54e4-8788-475c-b342-65dddd71342f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4330b5f4-c9') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:20:09 user nova-compute[70975]: DEBUG nova.objects.instance [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Lazy-loading 'pci_devices' on Instance uuid 5f4e6f9b-5413-4399-83ca-9bc78911db38 {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:20:09 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] End _get_guest_xml xml= Apr 18 16:20:09 user nova-compute[70975]: 5f4e6f9b-5413-4399-83ca-9bc78911db38 Apr 18 16:20:09 user nova-compute[70975]: instance-00000011 Apr 18 16:20:09 user nova-compute[70975]: 131072 Apr 18 16:20:09 user nova-compute[70975]: 1 Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: tempest-SnapshotDataIntegrityTests-server-751902854 Apr 18 16:20:09 user nova-compute[70975]: 2023-04-18 16:20:09 Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: 128 Apr 18 16:20:09 user nova-compute[70975]: 1 Apr 18 16:20:09 user nova-compute[70975]: 0 Apr 18 16:20:09 user nova-compute[70975]: 0 Apr 18 16:20:09 user nova-compute[70975]: 1 Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: tempest-SnapshotDataIntegrityTests-155672580-project-member Apr 18 16:20:09 user nova-compute[70975]: tempest-SnapshotDataIntegrityTests-155672580 Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: OpenStack Foundation Apr 18 16:20:09 user nova-compute[70975]: OpenStack Nova Apr 18 16:20:09 user nova-compute[70975]: 0.0.0 Apr 18 16:20:09 user nova-compute[70975]: 5f4e6f9b-5413-4399-83ca-9bc78911db38 Apr 18 16:20:09 user 
nova-compute[70975]: 5f4e6f9b-5413-4399-83ca-9bc78911db38 Apr 18 16:20:09 user nova-compute[70975]: Virtual Machine Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: hvm Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Nehalem Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: /dev/urandom Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: Apr 18 16:20:09 user nova-compute[70975]: {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 18 16:20:09 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:20:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SnapshotDataIntegrityTests-server-751902854',display_name='tempest-SnapshotDataIntegrityTests-server-751902854',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-snapshotdataintegritytests-server-751902854',id=17,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEjB7c1t/30/n9+kA02kID4xCptJhD3sPRt7RADFNBND5ODgA1T9Tp0ttLsMngsvRYW4lRZy7x7r7VmJz6i3Dd8HWEtRffPcfiU0xC7owaaIepRkCkRUzrPyKKptrcjx0A==',key_name='tempest-SnapshotDataIntegrityTests-898730089',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='000fb9c948224fe3b595882d36cfb859',ramdisk_id='',reservation_id='r-1fmw5k8n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-SnapshotDataIntegrityTests-155672580',owner_user_name='tempest-SnapshotDataIntegrityTests-155672580-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:20:07Z,user_data=None,user_id='b01da22d0c8c4ee580567646e279d7b9',uuid=5f4e6f9b-5413-4399-83ca-9bc78911db38,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4330b5f4-c990-4a7d-aa5f-f95315fddf78", "address": "fa:16:3e:6f:03:b0", "network": {"id": "c17e54e4-8788-475c-b342-65dddd71342f", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-2093298384-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "000fb9c948224fe3b595882d36cfb859", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4330b5f4-c9", "ovs_interfaceid": "4330b5f4-c990-4a7d-aa5f-f95315fddf78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 18 16:20:09 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Converting VIF {"id": "4330b5f4-c990-4a7d-aa5f-f95315fddf78", "address": "fa:16:3e:6f:03:b0", "network": {"id": "c17e54e4-8788-475c-b342-65dddd71342f", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-2093298384-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "000fb9c948224fe3b595882d36cfb859", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4330b5f4-c9", "ovs_interfaceid": "4330b5f4-c990-4a7d-aa5f-f95315fddf78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:20:09 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6f:03:b0,bridge_name='br-int',has_traffic_filtering=True,id=4330b5f4-c990-4a7d-aa5f-f95315fddf78,network=Network(c17e54e4-8788-475c-b342-65dddd71342f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4330b5f4-c9') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:20:09 user nova-compute[70975]: DEBUG os_vif [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6f:03:b0,bridge_name='br-int',has_traffic_filtering=True,id=4330b5f4-c990-4a7d-aa5f-f95315fddf78,network=Network(c17e54e4-8788-475c-b342-65dddd71342f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4330b5f4-c9') {{(pid=70975) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 18 16:20:09 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:09 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:20:09 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 18 16:20:09 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:09 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4330b5f4-c9, may_exist=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:20:09 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4330b5f4-c9, col_values=(('external_ids', {'iface-id': '4330b5f4-c990-4a7d-aa5f-f95315fddf78', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6f:03:b0', 'vm-uuid': '5f4e6f9b-5413-4399-83ca-9bc78911db38'}),)) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:20:09 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:09 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:20:09 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:09 user nova-compute[70975]: INFO os_vif [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6f:03:b0,bridge_name='br-int',has_traffic_filtering=True,id=4330b5f4-c990-4a7d-aa5f-f95315fddf78,network=Network(c17e54e4-8788-475c-b342-65dddd71342f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4330b5f4-c9') Apr 18 16:20:09 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] No BDM found with device name vda, not building metadata. {{(pid=70975) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 18 16:20:09 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] No VIF found with MAC fa:16:3e:6f:03:b0, not building metadata {{(pid=70975) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 18 16:20:10 user nova-compute[70975]: DEBUG nova.network.neutron [req-6cdbf5de-6b5a-4167-8801-73edc86728dc req-257c65fd-02aa-4bab-b67f-eac619dd4e01 service nova] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Updated VIF entry in instance network info cache for port 4330b5f4-c990-4a7d-aa5f-f95315fddf78. 
{{(pid=70975) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 18 16:20:10 user nova-compute[70975]: DEBUG nova.network.neutron [req-6cdbf5de-6b5a-4167-8801-73edc86728dc req-257c65fd-02aa-4bab-b67f-eac619dd4e01 service nova] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Updating instance_info_cache with network_info: [{"id": "4330b5f4-c990-4a7d-aa5f-f95315fddf78", "address": "fa:16:3e:6f:03:b0", "network": {"id": "c17e54e4-8788-475c-b342-65dddd71342f", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-2093298384-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "000fb9c948224fe3b595882d36cfb859", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4330b5f4-c9", "ovs_interfaceid": "4330b5f4-c990-4a7d-aa5f-f95315fddf78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:20:10 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-6cdbf5de-6b5a-4167-8801-73edc86728dc req-257c65fd-02aa-4bab-b67f-eac619dd4e01 service nova] Releasing lock "refresh_cache-5f4e6f9b-5413-4399-83ca-9bc78911db38" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:20:11 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:11 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:11 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:11 user nova-compute[70975]: DEBUG nova.compute.manager [req-93049643-9c0c-491a-93bb-88c022f122c6 req-6dd305a5-a413-4f01-8b69-8e2055a1a2c1 service nova] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Received event network-vif-plugged-4330b5f4-c990-4a7d-aa5f-f95315fddf78 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:20:11 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-93049643-9c0c-491a-93bb-88c022f122c6 req-6dd305a5-a413-4f01-8b69-8e2055a1a2c1 service nova] Acquiring lock "5f4e6f9b-5413-4399-83ca-9bc78911db38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:20:11 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-93049643-9c0c-491a-93bb-88c022f122c6 req-6dd305a5-a413-4f01-8b69-8e2055a1a2c1 service nova] Lock "5f4e6f9b-5413-4399-83ca-9bc78911db38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:20:11 
user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-93049643-9c0c-491a-93bb-88c022f122c6 req-6dd305a5-a413-4f01-8b69-8e2055a1a2c1 service nova] Lock "5f4e6f9b-5413-4399-83ca-9bc78911db38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:20:11 user nova-compute[70975]: DEBUG nova.compute.manager [req-93049643-9c0c-491a-93bb-88c022f122c6 req-6dd305a5-a413-4f01-8b69-8e2055a1a2c1 service nova] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] No waiting events found dispatching network-vif-plugged-4330b5f4-c990-4a7d-aa5f-f95315fddf78 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:20:11 user nova-compute[70975]: WARNING nova.compute.manager [req-93049643-9c0c-491a-93bb-88c022f122c6 req-6dd305a5-a413-4f01-8b69-8e2055a1a2c1 service nova] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Received unexpected event network-vif-plugged-4330b5f4-c990-4a7d-aa5f-f95315fddf78 for instance with vm_state building and task_state spawning. Apr 18 16:20:11 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:11 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:12 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:13 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Resumed> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:20:13 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] VM Resumed (Lifecycle Event) Apr 18 16:20:13 user nova-compute[70975]: DEBUG nova.compute.manager [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Instance event wait completed in 0 seconds for {{(pid=70975) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 18 16:20:13 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Guest created on hypervisor {{(pid=70975) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 18 16:20:13 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Instance spawned successfully. 
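The AddBridgeCommand/AddPortCommand/DbSetCommand transaction logged above is how os-vif's ovs plugin wires the tap device into br-int over the native OVSDB protocol. Below is a minimal sketch of the same plumbing done by shelling out to ovs-vsctl from Python; the bridge, port name and external_ids values are taken from the DbSetCommand entry above, while the subprocess approach itself is purely illustrative and not the code path os-vif actually takes.

# Illustrative equivalent of the logged AddPortCommand + DbSetCommand,
# expressed as a single ovs-vsctl transaction (run as root on the compute host).
import subprocess

BRIDGE = "br-int"
PORT = "tap4330b5f4-c9"
EXTERNAL_IDS = {
    "iface-id": "4330b5f4-c990-4a7d-aa5f-f95315fddf78",
    "iface-status": "active",
    "attached-mac": "fa:16:3e:6f:03:b0",
    "vm-uuid": "5f4e6f9b-5413-4399-83ca-9bc78911db38",
}

# Add the port idempotently (like may_exist=True in the logged command) and
# set its external_ids. Values are double-quoted for ovs-vsctl's own record
# parser; no shell is involved.
cmd = ["ovs-vsctl", "--may-exist", "add-port", BRIDGE, PORT,
       "--", "set", "Interface", PORT]
cmd += ['external_ids:{}="{}"'.format(k, v) for k, v in EXTERNAL_IDS.items()]
subprocess.run(cmd, check=True)

Both forms are idempotent, which is why a repeated bridge-creation transaction shows up above as "Transaction caused no change" rather than an error.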
Apr 18 16:20:13 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 18 16:20:13 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:20:13 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:20:13 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Found default for hw_cdrom_bus of ide {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:20:13 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Found default for hw_disk_bus of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:20:13 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Found default for hw_input_bus of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:20:13 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Found default for hw_pointer_model of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:20:13 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Found default for hw_video_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:20:13 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] 
[instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Found default for hw_vif_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:20:13 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] During sync_power_state the instance has a pending task (spawning). Skip. Apr 18 16:20:13 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Started> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:20:13 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] VM Started (Lifecycle Event) Apr 18 16:20:13 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:20:13 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:20:13 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] During sync_power_state the instance has a pending task (spawning). Skip. Apr 18 16:20:13 user nova-compute[70975]: INFO nova.compute.manager [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Took 5.80 seconds to spawn the instance on the hypervisor. Apr 18 16:20:13 user nova-compute[70975]: DEBUG nova.compute.manager [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:20:13 user nova-compute[70975]: INFO nova.compute.manager [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Took 6.43 seconds to build instance. 
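The two INFO summaries just above ("Took 5.80 seconds to spawn the instance on the hypervisor", "Took 6.43 seconds to build instance") are the simplest handle for measuring boot latency across a run like this. A small sketch for pulling those figures out of a captured log; the file name is a placeholder and the regex relies only on the message formats visible above.

# Collect per-instance spawn/build durations from a nova-compute log capture.
import re
from collections import defaultdict

PATTERN = re.compile(
    r"\[instance: (?P<uuid>[0-9a-f-]{36})\] Took (?P<secs>[\d.]+) seconds to "
    r"(?P<what>spawn the instance on the hypervisor|build instance)"
)

timings = defaultdict(dict)
with open("nova-compute.log") as fh:  # placeholder path for this capture
    for line in fh:
        m = PATTERN.search(line)
        if m:
            key = "spawn" if m.group("what").startswith("spawn") else "build"
            timings[m.group("uuid")][key] = float(m.group("secs"))

for uuid, t in sorted(timings.items()):
    print(uuid, t)  # e.g. 5f4e6f9b-... {'spawn': 5.8, 'build': 6.43}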
Apr 18 16:20:13 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-d693e0ca-58b5-43b5-90ed-a15fd0c0cf0f tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Lock "5f4e6f9b-5413-4399-83ca-9bc78911db38" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.535s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:20:13 user nova-compute[70975]: DEBUG nova.compute.manager [req-13235056-56e9-4221-89f1-7ac3f1e53f28 req-55c0cd1b-eb38-4ee0-b33e-413e1ff75caa service nova] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Received event network-vif-plugged-4330b5f4-c990-4a7d-aa5f-f95315fddf78 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:20:13 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-13235056-56e9-4221-89f1-7ac3f1e53f28 req-55c0cd1b-eb38-4ee0-b33e-413e1ff75caa service nova] Acquiring lock "5f4e6f9b-5413-4399-83ca-9bc78911db38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:20:13 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-13235056-56e9-4221-89f1-7ac3f1e53f28 req-55c0cd1b-eb38-4ee0-b33e-413e1ff75caa service nova] Lock "5f4e6f9b-5413-4399-83ca-9bc78911db38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:20:13 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-13235056-56e9-4221-89f1-7ac3f1e53f28 req-55c0cd1b-eb38-4ee0-b33e-413e1ff75caa service nova] Lock "5f4e6f9b-5413-4399-83ca-9bc78911db38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:20:13 user nova-compute[70975]: DEBUG nova.compute.manager [req-13235056-56e9-4221-89f1-7ac3f1e53f28 req-55c0cd1b-eb38-4ee0-b33e-413e1ff75caa service nova] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] No waiting events found dispatching network-vif-plugged-4330b5f4-c990-4a7d-aa5f-f95315fddf78 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:20:13 user nova-compute[70975]: WARNING nova.compute.manager [req-13235056-56e9-4221-89f1-7ac3f1e53f28 req-55c0cd1b-eb38-4ee0-b33e-413e1ff75caa service nova] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Received unexpected event network-vif-plugged-4330b5f4-c990-4a7d-aa5f-f95315fddf78 for instance with vm_state active and task_state None. 
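The WARNING entries above ("Received unexpected event network-vif-plugged-... for instance with vm_state building/active") are benign: Neutron/OVN emits network-vif-plugged when the port comes up, but at that moment no waiter is registered (or it has already been satisfied), so "No waiting events found" is logged and the event is dropped. A toy sketch of that prepare/wait/pop pattern, for illustration only; it is not the implementation in nova/compute/manager.py.

# Toy prepare -> wait -> pop flow behind the "No waiting events found
# dispatching ..." and "Received unexpected event ..." messages.
import threading

_waiters = {}  # (instance_uuid, event_name) -> threading.Event
_lock = threading.Lock()

def prepare_for_event(instance_uuid, event_name):
    # Registered by the build path before the operation that triggers the event.
    ev = threading.Event()
    with _lock:
        _waiters[(instance_uuid, event_name)] = ev
    return ev

def pop_event(instance_uuid, event_name):
    # Called when an external event such as network-vif-plugged arrives.
    with _lock:
        ev = _waiters.pop((instance_uuid, event_name), None)
    if ev is None:
        # No one is waiting: the real service logs the "unexpected event"
        # warning and simply discards it.
        return False
    ev.set()
    return True

In this pattern the build path waits on the returned Event with a timeout; any duplicate or late copy of the same event arriving after the waiter has been popped is harmless noise, which is exactly what the two warnings record.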
Apr 18 16:20:14 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:14 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:16 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:19 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:22 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:24 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:26 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:27 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:29 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:32 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:34 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:34 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:37 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:39 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:39 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:42 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Acquiring lock "c16a352d-3f0c-4688-a890-81be1fee9f35" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:20:42 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None 
req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Lock "c16a352d-3f0c-4688-a890-81be1fee9f35" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:20:42 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Starting instance... {{(pid=70975) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 18 16:20:42 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:20:42 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:20:42 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=70975) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 18 16:20:42 user nova-compute[70975]: INFO nova.compute.claims [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Claim successful on node user Apr 18 16:20:42 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:42 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:20:42 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:20:42 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.372s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:20:42 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Start building networks asynchronously for instance. {{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 18 16:20:42 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Allocating IP information in the background. 
{{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 18 16:20:42 user nova-compute[70975]: DEBUG nova.network.neutron [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] allocate_for_instance() {{(pid=70975) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 18 16:20:42 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 18 16:20:42 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Start building block device mappings for instance. {{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 18 16:20:42 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Start spawning the instance on the hypervisor. {{(pid=70975) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 18 16:20:42 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Creating instance directory {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 18 16:20:42 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Creating image(s) Apr 18 16:20:42 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Acquiring lock "/opt/stack/data/nova/instances/c16a352d-3f0c-4688-a890-81be1fee9f35/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:20:42 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Lock "/opt/stack/data/nova/instances/c16a352d-3f0c-4688-a890-81be1fee9f35/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:20:42 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Lock "/opt/stack/data/nova/instances/c16a352d-3f0c-4688-a890-81be1fee9f35/disk.info" "released" by 
"nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:20:42 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:20:42 user nova-compute[70975]: DEBUG nova.policy [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3e222f73f9194870b4fac305cdd60f3a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fb49d5ddf6db4b1d807ba42fd37a919d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70975) authorize /opt/stack/nova/nova/policy.py:203}} Apr 18 16:20:42 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.132s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:20:42 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Acquiring lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:20:42 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:20:42 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:20:43 user 
nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.142s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:20:43 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/c16a352d-3f0c-4688-a890-81be1fee9f35/disk 1073741824 {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:20:43 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/c16a352d-3f0c-4688-a890-81be1fee9f35/disk 1073741824" returned: 0 in 0.048s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:20:43 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.196s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:20:43 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:20:43 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.144s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:20:43 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Checking if we can resize image 
/opt/stack/data/nova/instances/c16a352d-3f0c-4688-a890-81be1fee9f35/disk. size=1073741824 {{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 18 16:20:43 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c16a352d-3f0c-4688-a890-81be1fee9f35/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:20:43 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c16a352d-3f0c-4688-a890-81be1fee9f35/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:20:43 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Cannot resize image /opt/stack/data/nova/instances/c16a352d-3f0c-4688-a890-81be1fee9f35/disk to a smaller size. {{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 18 16:20:43 user nova-compute[70975]: DEBUG nova.objects.instance [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Lazy-loading 'migration_context' on Instance uuid c16a352d-3f0c-4688-a890-81be1fee9f35 {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:20:43 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Created local disks {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 18 16:20:43 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Ensure instance console log exists: /opt/stack/data/nova/instances/c16a352d-3f0c-4688-a890-81be1fee9f35/console.log {{(pid=70975) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 18 16:20:43 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:20:43 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Lock "vgpu_resources" acquired by 
"nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:20:43 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:20:43 user nova-compute[70975]: DEBUG nova.network.neutron [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Successfully created port: 159a1d71-92e3-4ac2-ad79-f530a49580e9 {{(pid=70975) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 18 16:20:44 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:44 user nova-compute[70975]: DEBUG nova.network.neutron [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Successfully updated port: 159a1d71-92e3-4ac2-ad79-f530a49580e9 {{(pid=70975) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 18 16:20:44 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Acquiring lock "refresh_cache-c16a352d-3f0c-4688-a890-81be1fee9f35" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:20:44 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Acquired lock "refresh_cache-c16a352d-3f0c-4688-a890-81be1fee9f35" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:20:44 user nova-compute[70975]: DEBUG nova.network.neutron [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Building network info cache for instance {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 18 16:20:44 user nova-compute[70975]: DEBUG nova.compute.manager [req-063209df-1df0-44d8-99a6-bb9ecef9bc20 req-d254d26b-a11c-4af0-8e18-387fb6af96f3 service nova] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Received event network-changed-159a1d71-92e3-4ac2-ad79-f530a49580e9 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:20:44 user nova-compute[70975]: DEBUG nova.compute.manager [req-063209df-1df0-44d8-99a6-bb9ecef9bc20 req-d254d26b-a11c-4af0-8e18-387fb6af96f3 service nova] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Refreshing instance network info cache due to event network-changed-159a1d71-92e3-4ac2-ad79-f530a49580e9. 
{{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 18 16:20:44 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-063209df-1df0-44d8-99a6-bb9ecef9bc20 req-d254d26b-a11c-4af0-8e18-387fb6af96f3 service nova] Acquiring lock "refresh_cache-c16a352d-3f0c-4688-a890-81be1fee9f35" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:20:44 user nova-compute[70975]: DEBUG nova.network.neutron [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Instance cache missing network info. {{(pid=70975) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 18 16:20:45 user nova-compute[70975]: DEBUG nova.network.neutron [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Updating instance_info_cache with network_info: [{"id": "159a1d71-92e3-4ac2-ad79-f530a49580e9", "address": "fa:16:3e:f9:70:73", "network": {"id": "a4882813-5e0c-44e4-b47a-bc4b69fbead5", "bridge": "br-int", "label": "tempest-VolumesActionsTest-2022397269-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "fb49d5ddf6db4b1d807ba42fd37a919d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap159a1d71-92", "ovs_interfaceid": "159a1d71-92e3-4ac2-ad79-f530a49580e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:20:45 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Releasing lock "refresh_cache-c16a352d-3f0c-4688-a890-81be1fee9f35" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:20:45 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Instance network_info: |[{"id": "159a1d71-92e3-4ac2-ad79-f530a49580e9", "address": "fa:16:3e:f9:70:73", "network": {"id": "a4882813-5e0c-44e4-b47a-bc4b69fbead5", "bridge": "br-int", "label": "tempest-VolumesActionsTest-2022397269-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "fb49d5ddf6db4b1d807ba42fd37a919d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, 
"devname": "tap159a1d71-92", "ovs_interfaceid": "159a1d71-92e3-4ac2-ad79-f530a49580e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 18 16:20:45 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-063209df-1df0-44d8-99a6-bb9ecef9bc20 req-d254d26b-a11c-4af0-8e18-387fb6af96f3 service nova] Acquired lock "refresh_cache-c16a352d-3f0c-4688-a890-81be1fee9f35" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:20:45 user nova-compute[70975]: DEBUG nova.network.neutron [req-063209df-1df0-44d8-99a6-bb9ecef9bc20 req-d254d26b-a11c-4af0-8e18-387fb6af96f3 service nova] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Refreshing network info cache for port 159a1d71-92e3-4ac2-ad79-f530a49580e9 {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 18 16:20:45 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Start _get_guest_xml network_info=[{"id": "159a1d71-92e3-4ac2-ad79-f530a49580e9", "address": "fa:16:3e:f9:70:73", "network": {"id": "a4882813-5e0c-44e4-b47a-bc4b69fbead5", "bridge": "br-int", "label": "tempest-VolumesActionsTest-2022397269-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "fb49d5ddf6db4b1d807ba42fd37a919d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap159a1d71-92", "ovs_interfaceid": "159a1d71-92e3-4ac2-ad79-f530a49580e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encrypted': False, 'device_type': 'disk', 'encryption_secret_uuid': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'b11a20de-f82a-4158-b53e-0a0c7a1552cb'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 18 16:20:45 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None 
req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:20:45 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:20:45 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70975) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 18 16:20:45 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-18T16:11:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=), allow threads: True {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 18 16:20:45 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Flavor limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 18 16:20:45 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Image limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 18 16:20:45 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Flavor pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 18 16:20:45 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Image pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 18 16:20:45 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, 
threads=65536 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 18 16:20:45 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 18 16:20:45 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 18 16:20:45 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Got 1 possible topologies {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 18 16:20:45 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 18 16:20:45 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 18 16:20:45 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:20:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-1247400120',display_name='tempest-VolumesActionsTest-instance-1247400120',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesactionstest-instance-1247400120',id=18,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fb49d5ddf6db4b1d807ba42fd37a919d',ramdisk_id='',reservation_id='r-2ruy1cgq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesActionsTest-1956026439',owner_user_name='tempest-VolumesActionsTest-1956026439-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:20:43Z,user_data=None,user_id='3e222f73f9194870b4fac305cdd60f3a',uuid=c16a352d-3f0c-4688-a890-81be1fee9f35,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "159a1d71-92e3-4ac2-ad79-f530a49580e9", "address": "fa:16:3e:f9:70:73", "network": {"id": "a4882813-5e0c-44e4-b47a-bc4b69fbead5", "bridge": "br-int", "label": "tempest-VolumesActionsTest-2022397269-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "fb49d5ddf6db4b1d807ba42fd37a919d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap159a1d71-92", "ovs_interfaceid": "159a1d71-92e3-4ac2-ad79-f530a49580e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70975) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 18 16:20:45 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Converting VIF {"id": "159a1d71-92e3-4ac2-ad79-f530a49580e9", "address": "fa:16:3e:f9:70:73", "network": {"id": "a4882813-5e0c-44e4-b47a-bc4b69fbead5", "bridge": "br-int", 
"label": "tempest-VolumesActionsTest-2022397269-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "fb49d5ddf6db4b1d807ba42fd37a919d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap159a1d71-92", "ovs_interfaceid": "159a1d71-92e3-4ac2-ad79-f530a49580e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:20:45 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:70:73,bridge_name='br-int',has_traffic_filtering=True,id=159a1d71-92e3-4ac2-ad79-f530a49580e9,network=Network(a4882813-5e0c-44e4-b47a-bc4b69fbead5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap159a1d71-92') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:20:45 user nova-compute[70975]: DEBUG nova.objects.instance [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Lazy-loading 'pci_devices' on Instance uuid c16a352d-3f0c-4688-a890-81be1fee9f35 {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:20:45 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] End _get_guest_xml xml= Apr 18 16:20:45 user nova-compute[70975]: c16a352d-3f0c-4688-a890-81be1fee9f35 Apr 18 16:20:45 user nova-compute[70975]: instance-00000012 Apr 18 16:20:45 user nova-compute[70975]: 131072 Apr 18 16:20:45 user nova-compute[70975]: 1 Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: tempest-VolumesActionsTest-instance-1247400120 Apr 18 16:20:45 user nova-compute[70975]: 2023-04-18 16:20:45 Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: 128 Apr 18 16:20:45 user nova-compute[70975]: 1 Apr 18 16:20:45 user nova-compute[70975]: 0 Apr 18 16:20:45 user nova-compute[70975]: 0 Apr 18 16:20:45 user nova-compute[70975]: 1 Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: tempest-VolumesActionsTest-1956026439-project-member Apr 18 16:20:45 user nova-compute[70975]: tempest-VolumesActionsTest-1956026439 Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user 
nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: OpenStack Foundation Apr 18 16:20:45 user nova-compute[70975]: OpenStack Nova Apr 18 16:20:45 user nova-compute[70975]: 0.0.0 Apr 18 16:20:45 user nova-compute[70975]: c16a352d-3f0c-4688-a890-81be1fee9f35 Apr 18 16:20:45 user nova-compute[70975]: c16a352d-3f0c-4688-a890-81be1fee9f35 Apr 18 16:20:45 user nova-compute[70975]: Virtual Machine Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: hvm Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Nehalem Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: /dev/urandom Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: Apr 18 16:20:45 user nova-compute[70975]: {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 18 16:20:45 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:20:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-1247400120',display_name='tempest-VolumesActionsTest-instance-1247400120',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesactionstest-instance-1247400120',id=18,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fb49d5ddf6db4b1d807ba42fd37a919d',ramdisk_id='',reservation_id='r-2ruy1cgq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesActionsTest-1956026439',owner_user_name='tempest-VolumesActionsTest-1956026439-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:20:43Z,user_data=None,user_id='3e222f73f9194870b4fac305cdd60f3a',uuid=c16a352d-3f0c-4688-a890-81be1fee9f35,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "159a1d71-92e3-4ac2-ad79-f530a49580e9", "address": "fa:16:3e:f9:70:73", "network": {"id": "a4882813-5e0c-44e4-b47a-bc4b69fbead5", "bridge": "br-int", "label": "tempest-VolumesActionsTest-2022397269-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "fb49d5ddf6db4b1d807ba42fd37a919d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap159a1d71-92", "ovs_interfaceid": "159a1d71-92e3-4ac2-ad79-f530a49580e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 18 16:20:45 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Converting VIF {"id": "159a1d71-92e3-4ac2-ad79-f530a49580e9", "address": "fa:16:3e:f9:70:73", "network": {"id": "a4882813-5e0c-44e4-b47a-bc4b69fbead5", "bridge": "br-int", "label": 
"tempest-VolumesActionsTest-2022397269-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "fb49d5ddf6db4b1d807ba42fd37a919d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap159a1d71-92", "ovs_interfaceid": "159a1d71-92e3-4ac2-ad79-f530a49580e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:20:45 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:70:73,bridge_name='br-int',has_traffic_filtering=True,id=159a1d71-92e3-4ac2-ad79-f530a49580e9,network=Network(a4882813-5e0c-44e4-b47a-bc4b69fbead5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap159a1d71-92') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:20:45 user nova-compute[70975]: DEBUG os_vif [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:70:73,bridge_name='br-int',has_traffic_filtering=True,id=159a1d71-92e3-4ac2-ad79-f530a49580e9,network=Network(a4882813-5e0c-44e4-b47a-bc4b69fbead5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap159a1d71-92') {{(pid=70975) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 18 16:20:45 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:45 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:20:45 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 18 16:20:45 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:45 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap159a1d71-92, may_exist=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:20:45 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap159a1d71-92, col_values=(('external_ids', {'iface-id': 
'159a1d71-92e3-4ac2-ad79-f530a49580e9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f9:70:73', 'vm-uuid': 'c16a352d-3f0c-4688-a890-81be1fee9f35'}),)) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:20:45 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:45 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:20:45 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:45 user nova-compute[70975]: INFO os_vif [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:70:73,bridge_name='br-int',has_traffic_filtering=True,id=159a1d71-92e3-4ac2-ad79-f530a49580e9,network=Network(a4882813-5e0c-44e4-b47a-bc4b69fbead5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap159a1d71-92') Apr 18 16:20:45 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] No BDM found with device name vda, not building metadata. {{(pid=70975) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 18 16:20:45 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] No VIF found with MAC fa:16:3e:f9:70:73, not building metadata {{(pid=70975) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 18 16:20:46 user nova-compute[70975]: DEBUG nova.network.neutron [req-063209df-1df0-44d8-99a6-bb9ecef9bc20 req-d254d26b-a11c-4af0-8e18-387fb6af96f3 service nova] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Updated VIF entry in instance network info cache for port 159a1d71-92e3-4ac2-ad79-f530a49580e9. 
{{(pid=70975) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 18 16:20:46 user nova-compute[70975]: DEBUG nova.network.neutron [req-063209df-1df0-44d8-99a6-bb9ecef9bc20 req-d254d26b-a11c-4af0-8e18-387fb6af96f3 service nova] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Updating instance_info_cache with network_info: [{"id": "159a1d71-92e3-4ac2-ad79-f530a49580e9", "address": "fa:16:3e:f9:70:73", "network": {"id": "a4882813-5e0c-44e4-b47a-bc4b69fbead5", "bridge": "br-int", "label": "tempest-VolumesActionsTest-2022397269-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "fb49d5ddf6db4b1d807ba42fd37a919d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap159a1d71-92", "ovs_interfaceid": "159a1d71-92e3-4ac2-ad79-f530a49580e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:20:46 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-063209df-1df0-44d8-99a6-bb9ecef9bc20 req-d254d26b-a11c-4af0-8e18-387fb6af96f3 service nova] Releasing lock "refresh_cache-c16a352d-3f0c-4688-a890-81be1fee9f35" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:20:46 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._sync_power_states {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:20:46 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:46 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:46 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:47 user nova-compute[70975]: DEBUG nova.compute.manager [req-1095e39a-f458-4a06-84fd-9a1f7a54fd84 req-16fd06e2-82c3-47e8-ac79-8709f50d0ac0 service nova] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Received event network-vif-plugged-159a1d71-92e3-4ac2-ad79-f530a49580e9 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:20:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-1095e39a-f458-4a06-84fd-9a1f7a54fd84 req-16fd06e2-82c3-47e8-ac79-8709f50d0ac0 service nova] Acquiring lock "c16a352d-3f0c-4688-a890-81be1fee9f35-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:20:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-1095e39a-f458-4a06-84fd-9a1f7a54fd84 req-16fd06e2-82c3-47e8-ac79-8709f50d0ac0 service 
nova] Lock "c16a352d-3f0c-4688-a890-81be1fee9f35-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.004s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:20:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-1095e39a-f458-4a06-84fd-9a1f7a54fd84 req-16fd06e2-82c3-47e8-ac79-8709f50d0ac0 service nova] Lock "c16a352d-3f0c-4688-a890-81be1fee9f35-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:20:47 user nova-compute[70975]: DEBUG nova.compute.manager [req-1095e39a-f458-4a06-84fd-9a1f7a54fd84 req-16fd06e2-82c3-47e8-ac79-8709f50d0ac0 service nova] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] No waiting events found dispatching network-vif-plugged-159a1d71-92e3-4ac2-ad79-f530a49580e9 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:20:47 user nova-compute[70975]: WARNING nova.compute.manager [req-1095e39a-f458-4a06-84fd-9a1f7a54fd84 req-16fd06e2-82c3-47e8-ac79-8709f50d0ac0 service nova] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Received unexpected event network-vif-plugged-159a1d71-92e3-4ac2-ad79-f530a49580e9 for instance with vm_state building and task_state spawning. Apr 18 16:20:47 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:47 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Triggering sync for uuid 1b530349-680e-4def-86ef-29c340efa175 {{(pid=70975) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 18 16:20:47 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Triggering sync for uuid 6528f05a-9f05-4f35-b991-687e4f47029e {{(pid=70975) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 18 16:20:47 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Triggering sync for uuid f5496c5f-292e-4912-991b-f834009e51a1 {{(pid=70975) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 18 16:20:47 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Triggering sync for uuid f6d6085d-9e15-4e29-ab90-3a8928971324 {{(pid=70975) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 18 16:20:47 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Triggering sync for uuid 5f4e6f9b-5413-4399-83ca-9bc78911db38 {{(pid=70975) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 18 16:20:47 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Triggering sync for uuid c16a352d-3f0c-4688-a890-81be1fee9f35 {{(pid=70975) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 18 16:20:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "1b530349-680e-4def-86ef-29c340efa175" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=70975) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:20:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "1b530349-680e-4def-86ef-29c340efa175" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:20:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "6528f05a-9f05-4f35-b991-687e4f47029e" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:20:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "6528f05a-9f05-4f35-b991-687e4f47029e" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:20:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "f5496c5f-292e-4912-991b-f834009e51a1" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:20:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "f5496c5f-292e-4912-991b-f834009e51a1" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:20:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "f6d6085d-9e15-4e29-ab90-3a8928971324" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:20:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "f6d6085d-9e15-4e29-ab90-3a8928971324" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:20:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "5f4e6f9b-5413-4399-83ca-9bc78911db38" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:20:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "5f4e6f9b-5413-4399-83ca-9bc78911db38" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s {{(pid=70975) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:20:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "c16a352d-3f0c-4688-a890-81be1fee9f35" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:20:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "6528f05a-9f05-4f35-b991-687e4f47029e" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.069s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:20:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "f5496c5f-292e-4912-991b-f834009e51a1" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.071s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:20:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "1b530349-680e-4def-86ef-29c340efa175" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.074s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:20:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "5f4e6f9b-5413-4399-83ca-9bc78911db38" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.072s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:20:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "f6d6085d-9e15-4e29-ab90-3a8928971324" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.079s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:20:47 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:49 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Resumed> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:20:49 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] VM Resumed (Lifecycle Event) Apr 18 16:20:49 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Instance event wait completed in 0 seconds for {{(pid=70975) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 18 16:20:49 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None 
req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Guest created on hypervisor {{(pid=70975) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 18 16:20:49 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Instance spawned successfully. Apr 18 16:20:49 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 18 16:20:49 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:20:49 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:20:49 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Found default for hw_cdrom_bus of ide {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:20:49 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Found default for hw_disk_bus of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:20:49 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Found default for hw_input_bus of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:20:49 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Found default for hw_pointer_model of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:20:49 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Found default 
for hw_video_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:20:49 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Found default for hw_vif_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:20:49 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] During sync_power_state the instance has a pending task (spawning). Skip. Apr 18 16:20:49 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Started> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:20:49 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] VM Started (Lifecycle Event) Apr 18 16:20:49 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:20:49 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:20:49 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] During sync_power_state the instance has a pending task (spawning). Skip. 
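
The "Synchronizing instance power state after lifecycle event ... / During sync_power_state the instance has a pending task (spawning). Skip." pairs above record a guard: the hypervisor reports the VM running (power_state 1) while the database still says NOSTATE (0), but because a task is in flight the sync is skipped. A minimal illustrative sketch of that decision follows (not Nova's actual code; the constants and function name are assumptions for the example):

    # Illustrative sketch only -- not Nova's implementation. It mirrors the decision
    # logged above: a lifecycle event reports the hypervisor power state, but while
    # the instance still has a pending task the DB state is left untouched.
    NOSTATE, RUNNING = 0, 1  # assumed numeric power states, matching the 0/1 in the log

    def sync_power_state_after_event(event, vm_state, task_state, db_power_state, vm_power_state):
        print(f'Synchronizing instance power state after lifecycle event "{event}"; '
              f'current vm_state: {vm_state}, current task_state: {task_state}, '
              f'current DB power_state: {db_power_state}, VM power_state: {vm_power_state}')
        if task_state is not None:
            print(f"During sync_power_state the instance has a pending task ({task_state}). Skip.")
            return db_power_state
        return vm_power_state  # no pending task: the DB record would be updated to the VM state

    # The "Resumed" and "Started" events above: building/spawning, DB 0, VM 1 -> skipped.
    sync_power_state_after_event("Resumed", "building", "spawning", NOSTATE, RUNNING)
    sync_power_state_after_event("Started", "building", "spawning", NOSTATE, RUNNING)
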
Apr 18 16:20:49 user nova-compute[70975]: DEBUG nova.compute.manager [req-a4c507c7-ebad-4fd6-8931-93d24b37bb85 req-3fa5f43e-a6b5-4d50-b7ae-40a92eec8614 service nova] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Received event network-vif-plugged-159a1d71-92e3-4ac2-ad79-f530a49580e9 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:20:49 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-a4c507c7-ebad-4fd6-8931-93d24b37bb85 req-3fa5f43e-a6b5-4d50-b7ae-40a92eec8614 service nova] Acquiring lock "c16a352d-3f0c-4688-a890-81be1fee9f35-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:20:49 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-a4c507c7-ebad-4fd6-8931-93d24b37bb85 req-3fa5f43e-a6b5-4d50-b7ae-40a92eec8614 service nova] Lock "c16a352d-3f0c-4688-a890-81be1fee9f35-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:20:49 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-a4c507c7-ebad-4fd6-8931-93d24b37bb85 req-3fa5f43e-a6b5-4d50-b7ae-40a92eec8614 service nova] Lock "c16a352d-3f0c-4688-a890-81be1fee9f35-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:20:49 user nova-compute[70975]: DEBUG nova.compute.manager [req-a4c507c7-ebad-4fd6-8931-93d24b37bb85 req-3fa5f43e-a6b5-4d50-b7ae-40a92eec8614 service nova] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] No waiting events found dispatching network-vif-plugged-159a1d71-92e3-4ac2-ad79-f530a49580e9 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:20:49 user nova-compute[70975]: WARNING nova.compute.manager [req-a4c507c7-ebad-4fd6-8931-93d24b37bb85 req-3fa5f43e-a6b5-4d50-b7ae-40a92eec8614 service nova] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Received unexpected event network-vif-plugged-159a1d71-92e3-4ac2-ad79-f530a49580e9 for instance with vm_state building and task_state spawning. Apr 18 16:20:49 user nova-compute[70975]: INFO nova.compute.manager [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Took 6.38 seconds to spawn the instance on the hypervisor. Apr 18 16:20:49 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:20:49 user nova-compute[70975]: INFO nova.compute.manager [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Took 7.11 seconds to build instance. 
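
The lockutils records threaded through this build (the per-instance "_locked_do_build_and_run_instance" lock released just below, the "<uuid>-events" lock around pop_instance_event above, and the "compute_resources" lock further down) all come from the same named-lock pattern in oslo.concurrency. A rough sketch of how such "Acquiring lock ... by ... / acquired ... waited / released ... held" lines are produced, assuming the oslo.concurrency library is installed; the function and lock names below are illustrative, not Nova's:

    # Rough sketch, assuming oslo.concurrency is available. The lockutils wrapper
    # logs acquisition, wait time and hold time whenever code runs under a named
    # lock; keying the lock on the instance UUID serializes concurrent operations
    # on the same instance. Names here are illustrative only.
    from oslo_concurrency import lockutils

    def build_instance(instance_uuid):
        @lockutils.synchronized(instance_uuid)           # one lock per instance UUID
        def _locked_build():
            print(f"building {instance_uuid} while holding its lock")
        _locked_build()

    def pop_instance_event(instance_uuid, event_name):
        with lockutils.lock(f"{instance_uuid}-events"):  # separate lock for the event map
            print(f"dispatching {event_name}")

    build_instance("c16a352d-3f0c-4688-a890-81be1fee9f35")
    pop_instance_event("c16a352d-3f0c-4688-a890-81be1fee9f35",
                       "network-vif-plugged-159a1d71-92e3-4ac2-ad79-f530a49580e9")
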
Apr 18 16:20:49 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-f7a2af34-cd5a-4437-8d90-212c6ca43a25 tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Lock "c16a352d-3f0c-4688-a890-81be1fee9f35" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.236s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:20:49 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "c16a352d-3f0c-4688-a890-81be1fee9f35" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 2.108s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:20:49 user nova-compute[70975]: INFO nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] During sync_power_state the instance has a pending task (spawning). Skip. Apr 18 16:20:49 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "c16a352d-3f0c-4688-a890-81be1fee9f35" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:20:50 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:50 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:51 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Acquiring lock "66df9389-d007-4737-8bb1-55bcb5f227ff" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:20:51 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Lock "66df9389-d007-4737-8bb1-55bcb5f227ff" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:20:51 user nova-compute[70975]: DEBUG nova.compute.manager [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Starting instance... 
{{(pid=70975) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 18 16:20:51 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:20:51 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:20:51 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=70975) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 18 16:20:51 user nova-compute[70975]: INFO nova.compute.claims [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Claim successful on node user Apr 18 16:20:52 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:52 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:20:52 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:20:52 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.703s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:20:52 user nova-compute[70975]: DEBUG nova.compute.manager [None req-cd985294-337e-4f05-8b4b-895bb806307b 
tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Start building networks asynchronously for instance. {{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 18 16:20:52 user nova-compute[70975]: DEBUG nova.compute.manager [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Allocating IP information in the background. {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 18 16:20:52 user nova-compute[70975]: DEBUG nova.network.neutron [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] allocate_for_instance() {{(pid=70975) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 18 16:20:52 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 18 16:20:52 user nova-compute[70975]: DEBUG nova.policy [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2963911de4f34d79816a9a1f9ad24a27', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5695adbb14ea4162bc40547b1509a1e4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70975) authorize /opt/stack/nova/nova/policy.py:203}} Apr 18 16:20:53 user nova-compute[70975]: DEBUG nova.compute.manager [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Start building block device mappings for instance. {{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 18 16:20:53 user nova-compute[70975]: DEBUG nova.compute.manager [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Start spawning the instance on the hypervisor. 
{{(pid=70975) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 18 16:20:53 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Creating instance directory {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 18 16:20:53 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Creating image(s) Apr 18 16:20:53 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Acquiring lock "/opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:20:53 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Lock "/opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:20:53 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Lock "/opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:20:53 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:20:53 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.160s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:20:53 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None 
req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Acquiring lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:20:53 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:20:53 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:20:53 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.164s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:20:53 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk 1073741824 {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:20:53 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk 1073741824" returned: 0 in 0.055s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:20:53 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.226s 
{{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:20:53 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:20:53 user nova-compute[70975]: DEBUG nova.network.neutron [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Successfully created port: b66d41ab-873c-4826-a3f8-d4f4276fff10 {{(pid=70975) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 18 16:20:53 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.175s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:20:53 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Checking if we can resize image /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk. size=1073741824 {{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 18 16:20:53 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:20:54 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk --force-share --output=json" returned: 0 in 0.146s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:20:54 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Cannot resize image /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk to a smaller size. 
{{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 18 16:20:54 user nova-compute[70975]: DEBUG nova.objects.instance [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Lazy-loading 'migration_context' on Instance uuid 66df9389-d007-4737-8bb1-55bcb5f227ff {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:20:54 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Created local disks {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 18 16:20:54 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Ensure instance console log exists: /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/console.log {{(pid=70975) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 18 16:20:54 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:20:54 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:20:54 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:20:54 user nova-compute[70975]: DEBUG nova.network.neutron [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Successfully updated port: b66d41ab-873c-4826-a3f8-d4f4276fff10 {{(pid=70975) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 18 16:20:54 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Acquiring lock "refresh_cache-66df9389-d007-4737-8bb1-55bcb5f227ff" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:20:54 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 
tempest-ServersNegativeTestJSON-1696086909-project-member] Acquired lock "refresh_cache-66df9389-d007-4737-8bb1-55bcb5f227ff" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:20:54 user nova-compute[70975]: DEBUG nova.network.neutron [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Building network info cache for instance {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 18 16:20:54 user nova-compute[70975]: DEBUG nova.compute.manager [req-d131cad2-e124-467e-b39c-d3e0cba48295 req-5f879130-854a-43b1-911d-df4784c10569 service nova] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Received event network-changed-b66d41ab-873c-4826-a3f8-d4f4276fff10 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:20:54 user nova-compute[70975]: DEBUG nova.compute.manager [req-d131cad2-e124-467e-b39c-d3e0cba48295 req-5f879130-854a-43b1-911d-df4784c10569 service nova] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Refreshing instance network info cache due to event network-changed-b66d41ab-873c-4826-a3f8-d4f4276fff10. {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 18 16:20:54 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-d131cad2-e124-467e-b39c-d3e0cba48295 req-5f879130-854a-43b1-911d-df4784c10569 service nova] Acquiring lock "refresh_cache-66df9389-d007-4737-8bb1-55bcb5f227ff" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:20:54 user nova-compute[70975]: DEBUG nova.network.neutron [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Instance cache missing network info. 
{{(pid=70975) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 18 16:20:55 user nova-compute[70975]: DEBUG nova.network.neutron [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Updating instance_info_cache with network_info: [{"id": "b66d41ab-873c-4826-a3f8-d4f4276fff10", "address": "fa:16:3e:58:32:25", "network": {"id": "236fa8aa-433b-4dfa-a787-f165c3389489", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1486162327-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5695adbb14ea4162bc40547b1509a1e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb66d41ab-87", "ovs_interfaceid": "b66d41ab-873c-4826-a3f8-d4f4276fff10", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:20:55 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Releasing lock "refresh_cache-66df9389-d007-4737-8bb1-55bcb5f227ff" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:20:55 user nova-compute[70975]: DEBUG nova.compute.manager [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Instance network_info: |[{"id": "b66d41ab-873c-4826-a3f8-d4f4276fff10", "address": "fa:16:3e:58:32:25", "network": {"id": "236fa8aa-433b-4dfa-a787-f165c3389489", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1486162327-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5695adbb14ea4162bc40547b1509a1e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb66d41ab-87", "ovs_interfaceid": "b66d41ab-873c-4826-a3f8-d4f4276fff10", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 18 16:20:55 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-d131cad2-e124-467e-b39c-d3e0cba48295 req-5f879130-854a-43b1-911d-df4784c10569 service nova] Acquired lock "refresh_cache-66df9389-d007-4737-8bb1-55bcb5f227ff" {{(pid=70975) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:20:55 user nova-compute[70975]: DEBUG nova.network.neutron [req-d131cad2-e124-467e-b39c-d3e0cba48295 req-5f879130-854a-43b1-911d-df4784c10569 service nova] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Refreshing network info cache for port b66d41ab-873c-4826-a3f8-d4f4276fff10 {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 18 16:20:55 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Start _get_guest_xml network_info=[{"id": "b66d41ab-873c-4826-a3f8-d4f4276fff10", "address": "fa:16:3e:58:32:25", "network": {"id": "236fa8aa-433b-4dfa-a787-f165c3389489", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1486162327-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5695adbb14ea4162bc40547b1509a1e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb66d41ab-87", "ovs_interfaceid": "b66d41ab-873c-4826-a3f8-d4f4276fff10", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encrypted': False, 'device_type': 'disk', 'encryption_secret_uuid': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'b11a20de-f82a-4158-b53e-0a0c7a1552cb'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 18 16:20:55 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:20:55 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 18 16:20:55 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70975) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 18 16:20:55 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-18T16:11:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=), allow threads: True {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 18 16:20:55 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Flavor limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 18 16:20:55 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Image limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 18 16:20:55 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Flavor pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 18 16:20:55 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Image pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 18 16:20:55 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 18 16:20:55 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) 
{{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 18 16:20:55 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 18 16:20:55 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Got 1 possible topologies {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 18 16:20:55 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 18 16:20:55 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 18 16:20:55 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:20:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1356234404',display_name='tempest-ServersNegativeTestJSON-server-1356234404',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-1356234404',id=19,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5695adbb14ea4162bc40547b1509a1e4',ramdisk_id='',reservation_id='r-9ymcy2v8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1696086909',owner_user_name='tempest-ServersNegativeTes
tJSON-1696086909-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:20:53Z,user_data=None,user_id='2963911de4f34d79816a9a1f9ad24a27',uuid=66df9389-d007-4737-8bb1-55bcb5f227ff,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b66d41ab-873c-4826-a3f8-d4f4276fff10", "address": "fa:16:3e:58:32:25", "network": {"id": "236fa8aa-433b-4dfa-a787-f165c3389489", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1486162327-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5695adbb14ea4162bc40547b1509a1e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb66d41ab-87", "ovs_interfaceid": "b66d41ab-873c-4826-a3f8-d4f4276fff10", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70975) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 18 16:20:55 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Converting VIF {"id": "b66d41ab-873c-4826-a3f8-d4f4276fff10", "address": "fa:16:3e:58:32:25", "network": {"id": "236fa8aa-433b-4dfa-a787-f165c3389489", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1486162327-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5695adbb14ea4162bc40547b1509a1e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb66d41ab-87", "ovs_interfaceid": "b66d41ab-873c-4826-a3f8-d4f4276fff10", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:20:55 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:32:25,bridge_name='br-int',has_traffic_filtering=True,id=b66d41ab-873c-4826-a3f8-d4f4276fff10,network=Network(236fa8aa-433b-4dfa-a787-f165c3389489),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb66d41ab-87') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:20:55 user nova-compute[70975]: DEBUG nova.objects.instance [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] 
Lazy-loading 'pci_devices' on Instance uuid 66df9389-d007-4737-8bb1-55bcb5f227ff {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:20:55 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:55 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] End _get_guest_xml xml= [libvirt guest domain XML elided: the markup was lost in extraction; recoverable values include uuid 66df9389-d007-4737-8bb1-55bcb5f227ff, domain name instance-00000013, display name tempest-ServersNegativeTestJSON-server-1356234404, creation time 2023-04-18 16:20:55, 131072 KiB memory, 1 vCPU, flavor values 128/1/0/0/1, owner tempest-ServersNegativeTestJSON-1696086909-project-member / tempest-ServersNegativeTestJSON-1696086909, sysinfo OpenStack Foundation / OpenStack Nova / 0.0.0, os type hvm, CPU model Nehalem, RNG backend /dev/urandom] Apr 18 16:20:55 user nova-compute[70975]: {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 18 16:20:55 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None
req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:20:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1356234404',display_name='tempest-ServersNegativeTestJSON-server-1356234404',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-1356234404',id=19,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5695adbb14ea4162bc40547b1509a1e4',ramdisk_id='',reservation_id='r-9ymcy2v8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1696086909',owner_user_name='tempest-ServersNegativeTestJSON-1696086909-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:20:53Z,user_data=None,user_id='2963911de4f34d79816a9a1f9ad24a27',uuid=66df9389-d007-4737-8bb1-55bcb5f227ff,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b66d41ab-873c-4826-a3f8-d4f4276fff10", "address": "fa:16:3e:58:32:25",
"network": {"id": "236fa8aa-433b-4dfa-a787-f165c3389489", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1486162327-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5695adbb14ea4162bc40547b1509a1e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb66d41ab-87", "ovs_interfaceid": "b66d41ab-873c-4826-a3f8-d4f4276fff10", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 18 16:20:55 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Converting VIF {"id": "b66d41ab-873c-4826-a3f8-d4f4276fff10", "address": "fa:16:3e:58:32:25", "network": {"id": "236fa8aa-433b-4dfa-a787-f165c3389489", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1486162327-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5695adbb14ea4162bc40547b1509a1e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb66d41ab-87", "ovs_interfaceid": "b66d41ab-873c-4826-a3f8-d4f4276fff10", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:20:55 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:32:25,bridge_name='br-int',has_traffic_filtering=True,id=b66d41ab-873c-4826-a3f8-d4f4276fff10,network=Network(236fa8aa-433b-4dfa-a787-f165c3389489),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb66d41ab-87') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:20:55 user nova-compute[70975]: DEBUG os_vif [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:32:25,bridge_name='br-int',has_traffic_filtering=True,id=b66d41ab-873c-4826-a3f8-d4f4276fff10,network=Network(236fa8aa-433b-4dfa-a787-f165c3389489),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb66d41ab-87') {{(pid=70975) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 18 16:20:55 user 
nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:55 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:20:55 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 18 16:20:55 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:55 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb66d41ab-87, may_exist=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:20:55 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb66d41ab-87, col_values=(('external_ids', {'iface-id': 'b66d41ab-873c-4826-a3f8-d4f4276fff10', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:58:32:25', 'vm-uuid': '66df9389-d007-4737-8bb1-55bcb5f227ff'}),)) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:20:55 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:55 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:20:55 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:55 user nova-compute[70975]: INFO os_vif [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:32:25,bridge_name='br-int',has_traffic_filtering=True,id=b66d41ab-873c-4826-a3f8-d4f4276fff10,network=Network(236fa8aa-433b-4dfa-a787-f165c3389489),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb66d41ab-87') Apr 18 16:20:55 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=70975) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 18 16:20:55 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] No VIF found with MAC fa:16:3e:58:32:25, not building metadata {{(pid=70975) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 18 16:20:56 user nova-compute[70975]: DEBUG nova.network.neutron [req-d131cad2-e124-467e-b39c-d3e0cba48295 req-5f879130-854a-43b1-911d-df4784c10569 service nova] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Updated VIF entry in instance network info cache for port b66d41ab-873c-4826-a3f8-d4f4276fff10. {{(pid=70975) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 18 16:20:56 user nova-compute[70975]: DEBUG nova.network.neutron [req-d131cad2-e124-467e-b39c-d3e0cba48295 req-5f879130-854a-43b1-911d-df4784c10569 service nova] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Updating instance_info_cache with network_info: [{"id": "b66d41ab-873c-4826-a3f8-d4f4276fff10", "address": "fa:16:3e:58:32:25", "network": {"id": "236fa8aa-433b-4dfa-a787-f165c3389489", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1486162327-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5695adbb14ea4162bc40547b1509a1e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb66d41ab-87", "ovs_interfaceid": "b66d41ab-873c-4826-a3f8-d4f4276fff10", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:20:56 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-d131cad2-e124-467e-b39c-d3e0cba48295 req-5f879130-854a-43b1-911d-df4784c10569 service nova] Releasing lock "refresh_cache-66df9389-d007-4737-8bb1-55bcb5f227ff" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:20:56 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:56 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:56 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:56 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:57 user nova-compute[70975]: DEBUG nova.compute.manager [req-c44549c5-6c55-4f91-bf8c-bdfa41737b51 req-6836aad1-e2a1-4bac-aca8-7e865ede39a3 service nova] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Received event 
network-vif-plugged-b66d41ab-873c-4826-a3f8-d4f4276fff10 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:20:57 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-c44549c5-6c55-4f91-bf8c-bdfa41737b51 req-6836aad1-e2a1-4bac-aca8-7e865ede39a3 service nova] Acquiring lock "66df9389-d007-4737-8bb1-55bcb5f227ff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:20:57 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-c44549c5-6c55-4f91-bf8c-bdfa41737b51 req-6836aad1-e2a1-4bac-aca8-7e865ede39a3 service nova] Lock "66df9389-d007-4737-8bb1-55bcb5f227ff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.003s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:20:57 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-c44549c5-6c55-4f91-bf8c-bdfa41737b51 req-6836aad1-e2a1-4bac-aca8-7e865ede39a3 service nova] Lock "66df9389-d007-4737-8bb1-55bcb5f227ff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.005s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:20:57 user nova-compute[70975]: DEBUG nova.compute.manager [req-c44549c5-6c55-4f91-bf8c-bdfa41737b51 req-6836aad1-e2a1-4bac-aca8-7e865ede39a3 service nova] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] No waiting events found dispatching network-vif-plugged-b66d41ab-873c-4826-a3f8-d4f4276fff10 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:20:57 user nova-compute[70975]: WARNING nova.compute.manager [req-c44549c5-6c55-4f91-bf8c-bdfa41737b51 req-6836aad1-e2a1-4bac-aca8-7e865ede39a3 service nova] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Received unexpected event network-vif-plugged-b66d41ab-873c-4826-a3f8-d4f4276fff10 for instance with vm_state building and task_state spawning. 
Apr 18 16:20:57 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:57 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:57 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:57 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:20:58 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Acquiring lock "5e79a758-6aed-4536-bb5d-1a905ec5d28d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:20:58 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "5e79a758-6aed-4536-bb5d-1a905ec5d28d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:20:58 user nova-compute[70975]: DEBUG nova.compute.manager [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] Starting instance... {{(pid=70975) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 18 16:20:58 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:20:58 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:20:58 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=70975) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 18 16:20:58 user nova-compute[70975]: INFO nova.compute.claims [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] Claim successful on node user Apr 18 16:20:58 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Resumed> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:20:58 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] VM Resumed (Lifecycle Event) Apr 18 16:20:58 user nova-compute[70975]: DEBUG nova.compute.manager [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Instance event wait completed in 0 seconds for {{(pid=70975) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 18 16:20:58 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Guest created on hypervisor {{(pid=70975) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 18 16:20:58 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Instance spawned successfully. Apr 18 16:20:58 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 18 16:20:58 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:20:58 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:20:59 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Found default for hw_cdrom_bus of ide {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:20:59 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-cd985294-337e-4f05-8b4b-895bb806307b 
tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Found default for hw_disk_bus of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:20:59 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Found default for hw_input_bus of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:20:59 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Found default for hw_pointer_model of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:20:59 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Found default for hw_video_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:20:59 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Found default for hw_vif_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:20:59 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 18 16:20:59 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Started> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:20:59 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] VM Started (Lifecycle Event) Apr 18 16:20:59 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:20:59 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:20:59 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:20:59 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:20:59 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] During sync_power_state the instance has a pending task (spawning). Skip. Apr 18 16:20:59 user nova-compute[70975]: INFO nova.compute.manager [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Took 5.75 seconds to spawn the instance on the hypervisor. 
Apr 18 16:20:59 user nova-compute[70975]: DEBUG nova.compute.manager [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:20:59 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.396s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:20:59 user nova-compute[70975]: DEBUG nova.compute.manager [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] Start building networks asynchronously for instance. {{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 18 16:20:59 user nova-compute[70975]: DEBUG nova.compute.manager [req-cd61bc88-a47a-4739-93db-a1df1f6e863d req-046aa720-f40f-4f0e-a1c1-b17ca7792190 service nova] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Received event network-vif-plugged-b66d41ab-873c-4826-a3f8-d4f4276fff10 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:20:59 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-cd61bc88-a47a-4739-93db-a1df1f6e863d req-046aa720-f40f-4f0e-a1c1-b17ca7792190 service nova] Acquiring lock "66df9389-d007-4737-8bb1-55bcb5f227ff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:20:59 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-cd61bc88-a47a-4739-93db-a1df1f6e863d req-046aa720-f40f-4f0e-a1c1-b17ca7792190 service nova] Lock "66df9389-d007-4737-8bb1-55bcb5f227ff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:20:59 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-cd61bc88-a47a-4739-93db-a1df1f6e863d req-046aa720-f40f-4f0e-a1c1-b17ca7792190 service nova] Lock "66df9389-d007-4737-8bb1-55bcb5f227ff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:20:59 user nova-compute[70975]: DEBUG nova.compute.manager [req-cd61bc88-a47a-4739-93db-a1df1f6e863d req-046aa720-f40f-4f0e-a1c1-b17ca7792190 service nova] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] No waiting events found dispatching network-vif-plugged-b66d41ab-873c-4826-a3f8-d4f4276fff10 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:20:59 user nova-compute[70975]: WARNING nova.compute.manager [req-cd61bc88-a47a-4739-93db-a1df1f6e863d req-046aa720-f40f-4f0e-a1c1-b17ca7792190 service nova] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Received unexpected event network-vif-plugged-b66d41ab-873c-4826-a3f8-d4f4276fff10 for instance 
with vm_state building and task_state spawning. Apr 18 16:20:59 user nova-compute[70975]: INFO nova.compute.manager [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Took 7.31 seconds to build instance. Apr 18 16:20:59 user nova-compute[70975]: DEBUG nova.compute.manager [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] Allocating IP information in the background. {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 18 16:20:59 user nova-compute[70975]: DEBUG nova.network.neutron [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] allocate_for_instance() {{(pid=70975) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 18 16:20:59 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 18 16:20:59 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-cd985294-337e-4f05-8b4b-895bb806307b tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Lock "66df9389-d007-4737-8bb1-55bcb5f227ff" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.412s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:20:59 user nova-compute[70975]: DEBUG nova.compute.manager [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] Start building block device mappings for instance. 
{{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 18 16:20:59 user nova-compute[70975]: INFO nova.virt.block_device [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] Booting with blank volume at /dev/vda Apr 18 16:20:59 user nova-compute[70975]: DEBUG nova.policy [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c54c277689214bd0a2cadb1e2ac288a9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f516f5ec45ca4508841c77f79e8c038b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70975) authorize /opt/stack/nova/nova/policy.py:203}} Apr 18 16:20:59 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:20:59 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:20:59 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=70975) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 18 16:21:00 user nova-compute[70975]: DEBUG nova.network.neutron [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] Successfully created port: 04c99cbe-321c-4c83-a015-4ad99a7b84a8 {{(pid=70975) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 18 16:21:00 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:00 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:21:00 user nova-compute[70975]: DEBUG nova.network.neutron [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] Successfully updated port: 04c99cbe-321c-4c83-a015-4ad99a7b84a8 {{(pid=70975) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 18 16:21:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Acquiring lock "refresh_cache-5e79a758-6aed-4536-bb5d-1a905ec5d28d" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:21:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Acquired lock "refresh_cache-5e79a758-6aed-4536-bb5d-1a905ec5d28d" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:21:00 user nova-compute[70975]: DEBUG nova.network.neutron [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] Building network info cache for instance {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 18 16:21:00 user nova-compute[70975]: DEBUG nova.compute.manager [req-3e287364-413b-4600-972a-63cd5beca62f req-72e0c7a8-d825-4344-b67f-3c50e25718bb service nova] [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] Received event network-changed-04c99cbe-321c-4c83-a015-4ad99a7b84a8 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:21:00 user nova-compute[70975]: DEBUG nova.compute.manager [req-3e287364-413b-4600-972a-63cd5beca62f req-72e0c7a8-d825-4344-b67f-3c50e25718bb service nova] [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] Refreshing instance network info cache due to event network-changed-04c99cbe-321c-4c83-a015-4ad99a7b84a8. 
{{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 18 16:21:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-3e287364-413b-4600-972a-63cd5beca62f req-72e0c7a8-d825-4344-b67f-3c50e25718bb service nova] Acquiring lock "refresh_cache-5e79a758-6aed-4536-bb5d-1a905ec5d28d" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:21:00 user nova-compute[70975]: DEBUG nova.network.neutron [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] Instance cache missing network info. {{(pid=70975) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 18 16:21:01 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:21:01 user nova-compute[70975]: DEBUG nova.network.neutron [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] Updating instance_info_cache with network_info: [{"id": "04c99cbe-321c-4c83-a015-4ad99a7b84a8", "address": "fa:16:3e:8f:aa:fb", "network": {"id": "923d10dc-c67e-4426-9c6e-856e903e2446", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-131274818-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f516f5ec45ca4508841c77f79e8c038b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap04c99cbe-32", "ovs_interfaceid": "04c99cbe-321c-4c83-a015-4ad99a7b84a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:21:01 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Releasing lock "refresh_cache-5e79a758-6aed-4536-bb5d-1a905ec5d28d" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:21:01 user nova-compute[70975]: DEBUG nova.compute.manager [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] Instance network_info: |[{"id": "04c99cbe-321c-4c83-a015-4ad99a7b84a8", "address": "fa:16:3e:8f:aa:fb", "network": {"id": "923d10dc-c67e-4426-9c6e-856e903e2446", "bridge": "br-int", "label": 
"tempest-ServerBootFromVolumeStableRescueTest-131274818-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f516f5ec45ca4508841c77f79e8c038b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap04c99cbe-32", "ovs_interfaceid": "04c99cbe-321c-4c83-a015-4ad99a7b84a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 18 16:21:01 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-3e287364-413b-4600-972a-63cd5beca62f req-72e0c7a8-d825-4344-b67f-3c50e25718bb service nova] Acquired lock "refresh_cache-5e79a758-6aed-4536-bb5d-1a905ec5d28d" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:21:01 user nova-compute[70975]: DEBUG nova.network.neutron [req-3e287364-413b-4600-972a-63cd5beca62f req-72e0c7a8-d825-4344-b67f-3c50e25718bb service nova] [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] Refreshing network info cache for port 04c99cbe-321c-4c83-a015-4ad99a7b84a8 {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 18 16:21:02 user nova-compute[70975]: DEBUG nova.network.neutron [req-3e287364-413b-4600-972a-63cd5beca62f req-72e0c7a8-d825-4344-b67f-3c50e25718bb service nova] [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] Updated VIF entry in instance network info cache for port 04c99cbe-321c-4c83-a015-4ad99a7b84a8. 
{{(pid=70975) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 18 16:21:02 user nova-compute[70975]: DEBUG nova.network.neutron [req-3e287364-413b-4600-972a-63cd5beca62f req-72e0c7a8-d825-4344-b67f-3c50e25718bb service nova] [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] Updating instance_info_cache with network_info: [{"id": "04c99cbe-321c-4c83-a015-4ad99a7b84a8", "address": "fa:16:3e:8f:aa:fb", "network": {"id": "923d10dc-c67e-4426-9c6e-856e903e2446", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-131274818-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f516f5ec45ca4508841c77f79e8c038b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap04c99cbe-32", "ovs_interfaceid": "04c99cbe-321c-4c83-a015-4ad99a7b84a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:21:02 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-3e287364-413b-4600-972a-63cd5beca62f req-72e0c7a8-d825-4344-b67f-3c50e25718bb service nova] Releasing lock "refresh_cache-5e79a758-6aed-4536-bb5d-1a905ec5d28d" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:21:02 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:21:02 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:21:02 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager.update_available_resource {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:21:02 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:21:02 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:21:02 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock 
"compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:21:02 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Auditing locally available compute resources for user (node: user) {{(pid=70975) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 18 16:21:02 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:21:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/disk --force-share --output=json" returned: 0 in 0.151s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:21:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:21:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:21:02 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f6d6085d-9e15-4e29-ab90-3a8928971324/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:21:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f6d6085d-9e15-4e29-ab90-3a8928971324/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} 
Apr 18 16:21:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f6d6085d-9e15-4e29-ab90-3a8928971324/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:21:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f6d6085d-9e15-4e29-ab90-3a8928971324/disk --force-share --output=json" returned: 0 in 0.145s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:21:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c16a352d-3f0c-4688-a890-81be1fee9f35/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:21:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c16a352d-3f0c-4688-a890-81be1fee9f35/disk --force-share --output=json" returned: 0 in 0.149s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:21:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c16a352d-3f0c-4688-a890-81be1fee9f35/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:21:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c16a352d-3f0c-4688-a890-81be1fee9f35/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:21:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5f4e6f9b-5413-4399-83ca-9bc78911db38/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:21:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C 
LANG=C qemu-img info /opt/stack/data/nova/instances/5f4e6f9b-5413-4399-83ca-9bc78911db38/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:21:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5f4e6f9b-5413-4399-83ca-9bc78911db38/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:21:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5f4e6f9b-5413-4399-83ca-9bc78911db38/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:21:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:21:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk --force-share --output=json" returned: 0 in 0.127s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:21:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:21:04 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:21:04 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk --force-share --output=json {{(pid=70975) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:21:04 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:21:04 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:21:04 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk --force-share --output=json" returned: 0 in 0.128s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:21:04 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f5496c5f-292e-4912-991b-f834009e51a1/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:21:04 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f5496c5f-292e-4912-991b-f834009e51a1/disk --force-share --output=json" returned: 0 in 0.128s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:21:04 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f5496c5f-292e-4912-991b-f834009e51a1/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:21:04 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f5496c5f-292e-4912-991b-f834009e51a1/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:21:05 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] This host 
appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:21:05 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:21:05 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Hypervisor/Node resource view: name=user free_ram=8245MB free_disk=26.54990005493164GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70975) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 18 16:21:05 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:21:05 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:21:05 user nova-compute[70975]: WARNING nova.compute.manager [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Volume id: 63fe53bd-9515-42b1-9d44-6baa7b7fc83f finished being created but its status is error. Apr 18 16:21:05 user nova-compute[70975]: ERROR nova.compute.manager [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] Instance failed block device setup: nova.exception.VolumeNotCreated: Volume 63fe53bd-9515-42b1-9d44-6baa7b7fc83f did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error. 
Apr 18 16:21:05 user nova-compute[70975]: ERROR nova.compute.manager [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] Traceback (most recent call last): Apr 18 16:21:05 user nova-compute[70975]: ERROR nova.compute.manager [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] File "/opt/stack/nova/nova/compute/manager.py", line 2175, in _prep_block_device Apr 18 16:21:05 user nova-compute[70975]: ERROR nova.compute.manager [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] driver_block_device.attach_block_devices( Apr 18 16:21:05 user nova-compute[70975]: ERROR nova.compute.manager [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] File "/opt/stack/nova/nova/virt/block_device.py", line 936, in attach_block_devices Apr 18 16:21:05 user nova-compute[70975]: ERROR nova.compute.manager [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] _log_and_attach(device) Apr 18 16:21:05 user nova-compute[70975]: ERROR nova.compute.manager [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] File "/opt/stack/nova/nova/virt/block_device.py", line 933, in _log_and_attach Apr 18 16:21:05 user nova-compute[70975]: ERROR nova.compute.manager [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] bdm.attach(*attach_args, **attach_kwargs) Apr 18 16:21:05 user nova-compute[70975]: ERROR nova.compute.manager [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] File "/opt/stack/nova/nova/virt/block_device.py", line 848, in attach Apr 18 16:21:05 user nova-compute[70975]: ERROR nova.compute.manager [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] self.volume_id, self.attachment_id = self._create_volume( Apr 18 16:21:05 user nova-compute[70975]: ERROR nova.compute.manager [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] File "/opt/stack/nova/nova/virt/block_device.py", line 435, in _create_volume Apr 18 16:21:05 user nova-compute[70975]: ERROR nova.compute.manager [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] self._call_wait_func(context, wait_func, volume_api, vol['id']) Apr 18 16:21:05 user nova-compute[70975]: ERROR nova.compute.manager [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] File "/opt/stack/nova/nova/virt/block_device.py", line 785, in _call_wait_func Apr 18 16:21:05 user nova-compute[70975]: ERROR nova.compute.manager [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] with excutils.save_and_reraise_exception(): Apr 18 16:21:05 user nova-compute[70975]: ERROR nova.compute.manager [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ Apr 18 16:21:05 user nova-compute[70975]: ERROR nova.compute.manager [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] self.force_reraise() Apr 18 16:21:05 user nova-compute[70975]: ERROR nova.compute.manager [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise Apr 18 16:21:05 user nova-compute[70975]: ERROR nova.compute.manager [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] raise self.value Apr 18 16:21:05 user nova-compute[70975]: ERROR nova.compute.manager [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] File "/opt/stack/nova/nova/virt/block_device.py", line 783, in _call_wait_func Apr 18 16:21:05 user nova-compute[70975]: ERROR nova.compute.manager [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] wait_func(context, volume_id) Apr 18 16:21:05 user nova-compute[70975]: ERROR nova.compute.manager [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] File "/opt/stack/nova/nova/compute/manager.py", line 1792, in 
_await_block_device_map_created Apr 18 16:21:05 user nova-compute[70975]: ERROR nova.compute.manager [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] raise exception.VolumeNotCreated(volume_id=vol_id, Apr 18 16:21:05 user nova-compute[70975]: ERROR nova.compute.manager [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] nova.exception.VolumeNotCreated: Volume 63fe53bd-9515-42b1-9d44-6baa7b7fc83f did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error. Apr 18 16:21:05 user nova-compute[70975]: ERROR nova.compute.manager [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] Apr 18 16:21:05 user nova-compute[70975]: DEBUG nova.compute.claims [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] Aborting claim: {{(pid=70975) abort /opt/stack/nova/nova/compute/claims.py:84}} Apr 18 16:21:05 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:21:05 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 1b530349-680e-4def-86ef-29c340efa175 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:21:05 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 6528f05a-9f05-4f35-b991-687e4f47029e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:21:05 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance f5496c5f-292e-4912-991b-f834009e51a1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:21:05 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance f6d6085d-9e15-4e29-ab90-3a8928971324 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:21:05 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 5f4e6f9b-5413-4399-83ca-9bc78911db38 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:21:05 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance c16a352d-3f0c-4688-a890-81be1fee9f35 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:21:05 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 66df9389-d007-4737-8bb1-55bcb5f227ff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:21:05 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 5e79a758-6aed-4536-bb5d-1a905ec5d28d actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:21:05 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Total usable vcpus: 12, total allocated vcpus: 8 {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 18 16:21:05 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Final resource view: name=user phys_ram=16023MB used_ram=1536MB phys_disk=40GB used_disk=7GB total_vcpus=12 used_vcpus=8 pci_stats=[] {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 18 16:21:05 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:05 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:21:05 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:21:05 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Compute_service record updated for user:user {{(pid=70975) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 18 16:21:05 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None 
req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.554s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:21:05 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.475s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:21:05 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:21:05 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:21:05 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.323s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:21:05 user nova-compute[70975]: DEBUG nova.compute.manager [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] Build of instance 5e79a758-6aed-4536-bb5d-1a905ec5d28d aborted: Volume 63fe53bd-9515-42b1-9d44-6baa7b7fc83f did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error. {{(pid=70975) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2636}} Apr 18 16:21:05 user nova-compute[70975]: DEBUG nova.compute.utils [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] Build of instance 5e79a758-6aed-4536-bb5d-1a905ec5d28d aborted: Volume 63fe53bd-9515-42b1-9d44-6baa7b7fc83f did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error. 
{{(pid=70975) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} Apr 18 16:21:05 user nova-compute[70975]: ERROR nova.compute.manager [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] Build of instance 5e79a758-6aed-4536-bb5d-1a905ec5d28d aborted: Volume 63fe53bd-9515-42b1-9d44-6baa7b7fc83f did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error.: nova.exception.BuildAbortException: Build of instance 5e79a758-6aed-4536-bb5d-1a905ec5d28d aborted: Volume 63fe53bd-9515-42b1-9d44-6baa7b7fc83f did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error. Apr 18 16:21:05 user nova-compute[70975]: DEBUG nova.compute.manager [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] Unplugging VIFs for instance {{(pid=70975) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} Apr 18 16:21:05 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:20:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-934050090',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-934050090',id=20,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f516f5ec45ca4508841c77f79e8c038b',ramdisk_id='',reservation_id='r-9vtlum7e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-2021464272',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member'},tags=TagList,task_state='block_device_mapping',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:20:59Z,user_data=None,user_id='c54c277689214bd0a2cadb1e2ac288a9',uuid=5e79a758-6aed-4536-bb5d-1a905ec5d28d,vcpu_model=No
ne,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "04c99cbe-321c-4c83-a015-4ad99a7b84a8", "address": "fa:16:3e:8f:aa:fb", "network": {"id": "923d10dc-c67e-4426-9c6e-856e903e2446", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-131274818-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f516f5ec45ca4508841c77f79e8c038b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap04c99cbe-32", "ovs_interfaceid": "04c99cbe-321c-4c83-a015-4ad99a7b84a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 18 16:21:05 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Converting VIF {"id": "04c99cbe-321c-4c83-a015-4ad99a7b84a8", "address": "fa:16:3e:8f:aa:fb", "network": {"id": "923d10dc-c67e-4426-9c6e-856e903e2446", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-131274818-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f516f5ec45ca4508841c77f79e8c038b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap04c99cbe-32", "ovs_interfaceid": "04c99cbe-321c-4c83-a015-4ad99a7b84a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:21:05 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8f:aa:fb,bridge_name='br-int',has_traffic_filtering=True,id=04c99cbe-321c-4c83-a015-4ad99a7b84a8,network=Network(923d10dc-c67e-4426-9c6e-856e903e2446),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04c99cbe-32') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:21:05 user nova-compute[70975]: DEBUG os_vif [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Unplugging vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:8f:aa:fb,bridge_name='br-int',has_traffic_filtering=True,id=04c99cbe-321c-4c83-a015-4ad99a7b84a8,network=Network(923d10dc-c67e-4426-9c6e-856e903e2446),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04c99cbe-32') {{(pid=70975) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 18 16:21:05 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:05 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap04c99cbe-32, bridge=br-int, if_exists=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:21:05 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 18 16:21:05 user nova-compute[70975]: INFO os_vif [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:aa:fb,bridge_name='br-int',has_traffic_filtering=True,id=04c99cbe-321c-4c83-a015-4ad99a7b84a8,network=Network(923d10dc-c67e-4426-9c6e-856e903e2446),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04c99cbe-32') Apr 18 16:21:05 user nova-compute[70975]: DEBUG nova.compute.manager [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] Unplugged VIFs for instance {{(pid=70975) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} Apr 18 16:21:05 user nova-compute[70975]: DEBUG nova.compute.manager [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] Deallocating network for instance {{(pid=70975) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 18 16:21:05 user nova-compute[70975]: DEBUG nova.network.neutron [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] deallocate_for_instance() {{(pid=70975) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 18 16:21:06 user nova-compute[70975]: DEBUG nova.network.neutron [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] Updating instance_info_cache with network_info: [] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:21:06 user nova-compute[70975]: INFO nova.compute.manager [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 
tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 5e79a758-6aed-4536-bb5d-1a905ec5d28d] Took 0.60 seconds to deallocate network for instance. Apr 18 16:21:06 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:21:06 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Starting heal instance info cache {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 18 16:21:06 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "refresh_cache-6528f05a-9f05-4f35-b991-687e4f47029e" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:21:06 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquired lock "refresh_cache-6528f05a-9f05-4f35-b991-687e4f47029e" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:21:06 user nova-compute[70975]: DEBUG nova.network.neutron [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Forcefully refreshing network info cache for instance {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 18 16:21:06 user nova-compute[70975]: INFO nova.scheduler.client.report [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Deleted allocations for instance 5e79a758-6aed-4536-bb5d-1a905ec5d28d Apr 18 16:21:06 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3b4d31ce-6713-473e-8e4a-51f0f3371c7c tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "5e79a758-6aed-4536-bb5d-1a905ec5d28d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 8.148s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:21:07 user nova-compute[70975]: DEBUG nova.network.neutron [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Updating instance_info_cache with network_info: [{"id": "08164ae1-ace4-4d80-ad79-1741eacfa16e", "address": "fa:16:3e:28:00:5b", "network": {"id": "923d10dc-c67e-4426-9c6e-856e903e2446", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-131274818-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f516f5ec45ca4508841c77f79e8c038b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap08164ae1-ac", "ovs_interfaceid": "08164ae1-ace4-4d80-ad79-1741eacfa16e", 
"qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:21:07 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Releasing lock "refresh_cache-6528f05a-9f05-4f35-b991-687e4f47029e" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:21:07 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Updated the network info_cache for instance {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 18 16:21:07 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:21:07 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:10 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:12 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:15 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:20 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:22 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:25 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:27 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:30 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:30 user nova-compute[70975]: DEBUG nova.compute.manager [req-f9a75b2c-80cd-4dda-b672-af07df54f993 req-23617ec2-f1c8-4cc2-bebc-339833ba64ab service nova] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Received event network-changed-4d7beeed-1a0b-490b-a788-3b8442f86758 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:21:30 user nova-compute[70975]: DEBUG nova.compute.manager [req-f9a75b2c-80cd-4dda-b672-af07df54f993 req-23617ec2-f1c8-4cc2-bebc-339833ba64ab 
service nova] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Refreshing instance network info cache due to event network-changed-4d7beeed-1a0b-490b-a788-3b8442f86758. {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 18 16:21:30 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-f9a75b2c-80cd-4dda-b672-af07df54f993 req-23617ec2-f1c8-4cc2-bebc-339833ba64ab service nova] Acquiring lock "refresh_cache-f5496c5f-292e-4912-991b-f834009e51a1" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:21:30 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-f9a75b2c-80cd-4dda-b672-af07df54f993 req-23617ec2-f1c8-4cc2-bebc-339833ba64ab service nova] Acquired lock "refresh_cache-f5496c5f-292e-4912-991b-f834009e51a1" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:21:30 user nova-compute[70975]: DEBUG nova.network.neutron [req-f9a75b2c-80cd-4dda-b672-af07df54f993 req-23617ec2-f1c8-4cc2-bebc-339833ba64ab service nova] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Refreshing network info cache for port 4d7beeed-1a0b-490b-a788-3b8442f86758 {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 18 16:21:31 user nova-compute[70975]: DEBUG nova.network.neutron [req-f9a75b2c-80cd-4dda-b672-af07df54f993 req-23617ec2-f1c8-4cc2-bebc-339833ba64ab service nova] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Updated VIF entry in instance network info cache for port 4d7beeed-1a0b-490b-a788-3b8442f86758. {{(pid=70975) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 18 16:21:31 user nova-compute[70975]: DEBUG nova.network.neutron [req-f9a75b2c-80cd-4dda-b672-af07df54f993 req-23617ec2-f1c8-4cc2-bebc-339833ba64ab service nova] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Updating instance_info_cache with network_info: [{"id": "4d7beeed-1a0b-490b-a788-3b8442f86758", "address": "fa:16:3e:fa:7c:e4", "network": {"id": "51cddd0f-0e4b-4d37-be40-ce5592263bc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1803491920-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f9987eeaa6b24ca48e80e8d5318f02ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d7beeed-1a", "ovs_interfaceid": "4d7beeed-1a0b-490b-a788-3b8442f86758", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:21:31 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-f9a75b2c-80cd-4dda-b672-af07df54f993 req-23617ec2-f1c8-4cc2-bebc-339833ba64ab service nova] Releasing lock "refresh_cache-f5496c5f-292e-4912-991b-f834009e51a1" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:21:32 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None 
req-ed1c6b26-2580-404e-8880-13635bfc33a5 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Acquiring lock "f5496c5f-292e-4912-991b-f834009e51a1" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:21:32 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ed1c6b26-2580-404e-8880-13635bfc33a5 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Lock "f5496c5f-292e-4912-991b-f834009e51a1" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:21:32 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ed1c6b26-2580-404e-8880-13635bfc33a5 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Acquiring lock "f5496c5f-292e-4912-991b-f834009e51a1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:21:32 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ed1c6b26-2580-404e-8880-13635bfc33a5 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Lock "f5496c5f-292e-4912-991b-f834009e51a1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:21:32 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ed1c6b26-2580-404e-8880-13635bfc33a5 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Lock "f5496c5f-292e-4912-991b-f834009e51a1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:21:32 user nova-compute[70975]: INFO nova.compute.manager [None req-ed1c6b26-2580-404e-8880-13635bfc33a5 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Terminating instance Apr 18 16:21:32 user nova-compute[70975]: DEBUG nova.compute.manager [None req-ed1c6b26-2580-404e-8880-13635bfc33a5 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Start destroying the instance on the hypervisor. 
{{(pid=70975) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 18 16:21:32 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:32 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:32 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:32 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:32 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:32 user nova-compute[70975]: DEBUG nova.compute.manager [req-5687a1fb-92ad-4df3-a163-e9d7c71aca9c req-3cc797aa-297e-4bfb-b2a3-10f0188ffdb2 service nova] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Received event network-vif-unplugged-4d7beeed-1a0b-490b-a788-3b8442f86758 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:21:32 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-5687a1fb-92ad-4df3-a163-e9d7c71aca9c req-3cc797aa-297e-4bfb-b2a3-10f0188ffdb2 service nova] Acquiring lock "f5496c5f-292e-4912-991b-f834009e51a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:21:32 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-5687a1fb-92ad-4df3-a163-e9d7c71aca9c req-3cc797aa-297e-4bfb-b2a3-10f0188ffdb2 service nova] Lock "f5496c5f-292e-4912-991b-f834009e51a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:21:32 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-5687a1fb-92ad-4df3-a163-e9d7c71aca9c req-3cc797aa-297e-4bfb-b2a3-10f0188ffdb2 service nova] Lock "f5496c5f-292e-4912-991b-f834009e51a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:21:32 user nova-compute[70975]: DEBUG nova.compute.manager [req-5687a1fb-92ad-4df3-a163-e9d7c71aca9c req-3cc797aa-297e-4bfb-b2a3-10f0188ffdb2 service nova] [instance: f5496c5f-292e-4912-991b-f834009e51a1] No waiting events found dispatching network-vif-unplugged-4d7beeed-1a0b-490b-a788-3b8442f86758 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:21:32 user nova-compute[70975]: DEBUG nova.compute.manager [req-5687a1fb-92ad-4df3-a163-e9d7c71aca9c req-3cc797aa-297e-4bfb-b2a3-10f0188ffdb2 service nova] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Received event network-vif-unplugged-4d7beeed-1a0b-490b-a788-3b8442f86758 for instance with task_state deleting. 
{{(pid=70975) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 18 16:21:33 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Instance destroyed successfully. Apr 18 16:21:33 user nova-compute[70975]: DEBUG nova.objects.instance [None req-ed1c6b26-2580-404e-8880-13635bfc33a5 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Lazy-loading 'resources' on Instance uuid f5496c5f-292e-4912-991b-f834009e51a1 {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:21:33 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-ed1c6b26-2580-404e-8880-13635bfc33a5 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:19:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1134586344',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-1134586344',id=15,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ/t2zRXHckg0I8s8IBMQHTN6DHAOZW32I+dgNI9pg+HkbIaOsxkar0QwwPFIjcioaOE616z5xMRZ4Ihxh2dkemRrU9uEbk/jjZSUa1gm6kQJDS4/DyUt2ZBHtNG3kEG3g==',key_name='tempest-keypair-1613556039',keypairs=,launch_index=0,launched_at=2023-04-18T16:19:47Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='f9987eeaa6b24ca48e80e8d5318f02ac',ramdisk_id='',reservation_id='r-7mk95l5u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeShelveTestJSON-1663710151',owner_user_name='tempest-AttachVolumeShelveTestJSON-1663710151-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-18T16:19:47Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='73a99bbf510f4f67bb7a35901ba3edc5',uuid=f5496c5f-292e-4912-991b-f834009e51a1,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4d7beeed-1a0b-490b-a788-3b8442f86758", "address": "fa:16:3e:fa:7c:e4", "network": {"id": "51cddd0f-0e4b-4d37-be40-ce5592263bc2", "bridge": "br-int", 
"label": "tempest-AttachVolumeShelveTestJSON-1803491920-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f9987eeaa6b24ca48e80e8d5318f02ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d7beeed-1a", "ovs_interfaceid": "4d7beeed-1a0b-490b-a788-3b8442f86758", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 18 16:21:33 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-ed1c6b26-2580-404e-8880-13635bfc33a5 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Converting VIF {"id": "4d7beeed-1a0b-490b-a788-3b8442f86758", "address": "fa:16:3e:fa:7c:e4", "network": {"id": "51cddd0f-0e4b-4d37-be40-ce5592263bc2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1803491920-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f9987eeaa6b24ca48e80e8d5318f02ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d7beeed-1a", "ovs_interfaceid": "4d7beeed-1a0b-490b-a788-3b8442f86758", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:21:33 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-ed1c6b26-2580-404e-8880-13635bfc33a5 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:fa:7c:e4,bridge_name='br-int',has_traffic_filtering=True,id=4d7beeed-1a0b-490b-a788-3b8442f86758,network=Network(51cddd0f-0e4b-4d37-be40-ce5592263bc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d7beeed-1a') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:21:33 user nova-compute[70975]: DEBUG os_vif [None req-ed1c6b26-2580-404e-8880-13635bfc33a5 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:fa:7c:e4,bridge_name='br-int',has_traffic_filtering=True,id=4d7beeed-1a0b-490b-a788-3b8442f86758,network=Network(51cddd0f-0e4b-4d37-be40-ce5592263bc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d7beeed-1a') {{(pid=70975) unplug 
/usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 18 16:21:33 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:33 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4d7beeed-1a, bridge=br-int, if_exists=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:21:33 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:33 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:21:33 user nova-compute[70975]: INFO os_vif [None req-ed1c6b26-2580-404e-8880-13635bfc33a5 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:fa:7c:e4,bridge_name='br-int',has_traffic_filtering=True,id=4d7beeed-1a0b-490b-a788-3b8442f86758,network=Network(51cddd0f-0e4b-4d37-be40-ce5592263bc2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d7beeed-1a') Apr 18 16:21:33 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-ed1c6b26-2580-404e-8880-13635bfc33a5 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Deleting instance files /opt/stack/data/nova/instances/f5496c5f-292e-4912-991b-f834009e51a1_del Apr 18 16:21:33 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-ed1c6b26-2580-404e-8880-13635bfc33a5 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Deletion of /opt/stack/data/nova/instances/f5496c5f-292e-4912-991b-f834009e51a1_del complete Apr 18 16:21:33 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:33 user nova-compute[70975]: INFO nova.compute.manager [None req-ed1c6b26-2580-404e-8880-13635bfc33a5 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Took 0.66 seconds to destroy the instance on the hypervisor. Apr 18 16:21:33 user nova-compute[70975]: DEBUG oslo.service.loopingcall [None req-ed1c6b26-2580-404e-8880-13635bfc33a5 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=70975) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 18 16:21:33 user nova-compute[70975]: DEBUG nova.compute.manager [-] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Deallocating network for instance {{(pid=70975) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 18 16:21:33 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: f5496c5f-292e-4912-991b-f834009e51a1] deallocate_for_instance() {{(pid=70975) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 18 16:21:33 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Updating instance_info_cache with network_info: [] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:21:33 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Took 0.67 seconds to deallocate network for instance. Apr 18 16:21:33 user nova-compute[70975]: DEBUG nova.compute.manager [req-8b652470-ba81-4cfb-85ab-c4b646e5df5b req-146bc672-e1a4-484b-a0f3-d396bf3b31ce service nova] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Received event network-vif-deleted-4d7beeed-1a0b-490b-a788-3b8442f86758 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:21:33 user nova-compute[70975]: INFO nova.compute.manager [req-8b652470-ba81-4cfb-85ab-c4b646e5df5b req-146bc672-e1a4-484b-a0f3-d396bf3b31ce service nova] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Neutron deleted interface 4d7beeed-1a0b-490b-a788-3b8442f86758; detaching it from the instance and deleting it from the info cache Apr 18 16:21:33 user nova-compute[70975]: DEBUG nova.network.neutron [req-8b652470-ba81-4cfb-85ab-c4b646e5df5b req-146bc672-e1a4-484b-a0f3-d396bf3b31ce service nova] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Updating instance_info_cache with network_info: [] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:21:33 user nova-compute[70975]: DEBUG nova.compute.manager [req-8b652470-ba81-4cfb-85ab-c4b646e5df5b req-146bc672-e1a4-484b-a0f3-d396bf3b31ce service nova] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Detach interface failed, port_id=4d7beeed-1a0b-490b-a788-3b8442f86758, reason: Instance f5496c5f-292e-4912-991b-f834009e51a1 could not be found. 
{{(pid=70975) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 18 16:21:33 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ed1c6b26-2580-404e-8880-13635bfc33a5 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:21:33 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ed1c6b26-2580-404e-8880-13635bfc33a5 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:21:34 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-ed1c6b26-2580-404e-8880-13635bfc33a5 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:21:34 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-ed1c6b26-2580-404e-8880-13635bfc33a5 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:21:34 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ed1c6b26-2580-404e-8880-13635bfc33a5 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.256s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:21:34 user nova-compute[70975]: INFO nova.scheduler.client.report [None req-ed1c6b26-2580-404e-8880-13635bfc33a5 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Deleted allocations for instance f5496c5f-292e-4912-991b-f834009e51a1 Apr 18 16:21:34 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ed1c6b26-2580-404e-8880-13635bfc33a5 tempest-AttachVolumeShelveTestJSON-1663710151 tempest-AttachVolumeShelveTestJSON-1663710151-project-member] Lock "f5496c5f-292e-4912-991b-f834009e51a1" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.775s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:21:34 user nova-compute[70975]: DEBUG nova.compute.manager [req-39501554-6ec7-43c0-b17c-f3cc4b287718 req-039a2987-077c-4fd5-a91a-5df0fc07e610 service nova] [instance: 
f5496c5f-292e-4912-991b-f834009e51a1] Received event network-vif-plugged-4d7beeed-1a0b-490b-a788-3b8442f86758 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:21:34 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-39501554-6ec7-43c0-b17c-f3cc4b287718 req-039a2987-077c-4fd5-a91a-5df0fc07e610 service nova] Acquiring lock "f5496c5f-292e-4912-991b-f834009e51a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:21:34 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-39501554-6ec7-43c0-b17c-f3cc4b287718 req-039a2987-077c-4fd5-a91a-5df0fc07e610 service nova] Lock "f5496c5f-292e-4912-991b-f834009e51a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:21:34 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-39501554-6ec7-43c0-b17c-f3cc4b287718 req-039a2987-077c-4fd5-a91a-5df0fc07e610 service nova] Lock "f5496c5f-292e-4912-991b-f834009e51a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:21:34 user nova-compute[70975]: DEBUG nova.compute.manager [req-39501554-6ec7-43c0-b17c-f3cc4b287718 req-039a2987-077c-4fd5-a91a-5df0fc07e610 service nova] [instance: f5496c5f-292e-4912-991b-f834009e51a1] No waiting events found dispatching network-vif-plugged-4d7beeed-1a0b-490b-a788-3b8442f86758 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:21:34 user nova-compute[70975]: WARNING nova.compute.manager [req-39501554-6ec7-43c0-b17c-f3cc4b287718 req-039a2987-077c-4fd5-a91a-5df0fc07e610 service nova] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Received unexpected event network-vif-plugged-4d7beeed-1a0b-490b-a788-3b8442f86758 for instance with vm_state deleted and task_state None. 
Apr 18 16:21:38 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:42 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:43 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:48 user nova-compute[70975]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:21:48 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: f5496c5f-292e-4912-991b-f834009e51a1] VM Stopped (Lifecycle Event) Apr 18 16:21:48 user nova-compute[70975]: DEBUG nova.compute.manager [None req-82442994-cb20-4c31-a209-67d91ce9a1cb None None] [instance: f5496c5f-292e-4912-991b-f834009e51a1] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:21:48 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:21:48 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:48 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=70975) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 18 16:21:48 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=70975) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 18 16:21:48 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70975) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 18 16:21:48 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:50 user nova-compute[70975]: DEBUG nova.compute.manager [req-88c920f1-4fff-4193-b914-7c8a52e58876 req-6c37ad8b-00e2-4ead-8d86-7d99bcd693aa service nova] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Received event network-changed-fb818849-31a0-4c25-b42d-ca19fe250ca6 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:21:50 user nova-compute[70975]: DEBUG nova.compute.manager [req-88c920f1-4fff-4193-b914-7c8a52e58876 req-6c37ad8b-00e2-4ead-8d86-7d99bcd693aa service nova] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Refreshing instance network info cache due to event network-changed-fb818849-31a0-4c25-b42d-ca19fe250ca6. 
{{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 18 16:21:50 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-88c920f1-4fff-4193-b914-7c8a52e58876 req-6c37ad8b-00e2-4ead-8d86-7d99bcd693aa service nova] Acquiring lock "refresh_cache-f6d6085d-9e15-4e29-ab90-3a8928971324" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:21:50 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-88c920f1-4fff-4193-b914-7c8a52e58876 req-6c37ad8b-00e2-4ead-8d86-7d99bcd693aa service nova] Acquired lock "refresh_cache-f6d6085d-9e15-4e29-ab90-3a8928971324" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:21:50 user nova-compute[70975]: DEBUG nova.network.neutron [req-88c920f1-4fff-4193-b914-7c8a52e58876 req-6c37ad8b-00e2-4ead-8d86-7d99bcd693aa service nova] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Refreshing network info cache for port fb818849-31a0-4c25-b42d-ca19fe250ca6 {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 18 16:21:50 user nova-compute[70975]: DEBUG nova.network.neutron [req-88c920f1-4fff-4193-b914-7c8a52e58876 req-6c37ad8b-00e2-4ead-8d86-7d99bcd693aa service nova] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Updated VIF entry in instance network info cache for port fb818849-31a0-4c25-b42d-ca19fe250ca6. {{(pid=70975) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 18 16:21:50 user nova-compute[70975]: DEBUG nova.network.neutron [req-88c920f1-4fff-4193-b914-7c8a52e58876 req-6c37ad8b-00e2-4ead-8d86-7d99bcd693aa service nova] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Updating instance_info_cache with network_info: [{"id": "fb818849-31a0-4c25-b42d-ca19fe250ca6", "address": "fa:16:3e:54:c5:ae", "network": {"id": "02aca424-2923-404b-9c66-76bec89f82b7", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1255258227-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.106", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6b4e8d8797be4c0e91b1401538f2eba8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb818849-31", "ovs_interfaceid": "fb818849-31a0-4c25-b42d-ca19fe250ca6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:21:50 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-88c920f1-4fff-4193-b914-7c8a52e58876 req-6c37ad8b-00e2-4ead-8d86-7d99bcd693aa service nova] Releasing lock "refresh_cache-f6d6085d-9e15-4e29-ab90-3a8928971324" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:21:50 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Acquiring 
lock "6aece7dd-d545-4e26-9cb7-30ee0b01ebb2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:21:50 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "6aece7dd-d545-4e26-9cb7-30ee0b01ebb2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:21:50 user nova-compute[70975]: DEBUG nova.compute.manager [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Starting instance... {{(pid=70975) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 18 16:21:50 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:21:50 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:21:50 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=70975) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 18 16:21:50 user nova-compute[70975]: INFO nova.compute.claims [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Claim successful on node user Apr 18 16:21:51 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:21:51 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:21:51 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.416s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:21:51 user nova-compute[70975]: DEBUG nova.compute.manager [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Start building networks asynchronously for instance. {{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 18 16:21:51 user nova-compute[70975]: DEBUG nova.compute.manager [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Allocating IP information in the background. 
{{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 18 16:21:51 user nova-compute[70975]: DEBUG nova.network.neutron [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] allocate_for_instance() {{(pid=70975) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 18 16:21:51 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 18 16:21:51 user nova-compute[70975]: DEBUG nova.compute.manager [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Start building block device mappings for instance. {{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 18 16:21:51 user nova-compute[70975]: DEBUG nova.policy [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c54c277689214bd0a2cadb1e2ac288a9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f516f5ec45ca4508841c77f79e8c038b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70975) authorize /opt/stack/nova/nova/policy.py:203}} Apr 18 16:21:51 user nova-compute[70975]: DEBUG nova.compute.manager [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Start spawning the instance on the hypervisor. 
{{(pid=70975) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 18 16:21:51 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Creating instance directory {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 18 16:21:51 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Creating image(s) Apr 18 16:21:51 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Acquiring lock "/opt/stack/data/nova/instances/6aece7dd-d545-4e26-9cb7-30ee0b01ebb2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:21:51 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "/opt/stack/data/nova/instances/6aece7dd-d545-4e26-9cb7-30ee0b01ebb2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:21:51 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "/opt/stack/data/nova/instances/6aece7dd-d545-4e26-9cb7-30ee0b01ebb2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:21:51 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:21:51 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-eb2ba928-b73a-44f9-9b4a-c941ca49aebb tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Acquiring lock "f6d6085d-9e15-4e29-ab90-3a8928971324" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:21:51 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils 
[None req-eb2ba928-b73a-44f9-9b4a-c941ca49aebb tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "f6d6085d-9e15-4e29-ab90-3a8928971324" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:21:51 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-eb2ba928-b73a-44f9-9b4a-c941ca49aebb tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Acquiring lock "f6d6085d-9e15-4e29-ab90-3a8928971324-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:21:51 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-eb2ba928-b73a-44f9-9b4a-c941ca49aebb tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "f6d6085d-9e15-4e29-ab90-3a8928971324-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.002s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:21:51 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-eb2ba928-b73a-44f9-9b4a-c941ca49aebb tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "f6d6085d-9e15-4e29-ab90-3a8928971324-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:21:51 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.149s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:21:51 user nova-compute[70975]: INFO nova.compute.manager [None req-eb2ba928-b73a-44f9-9b4a-c941ca49aebb tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Terminating instance Apr 18 16:21:51 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Acquiring lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:21:51 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" acquired by 
"nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:21:51 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:21:51 user nova-compute[70975]: DEBUG nova.compute.manager [None req-eb2ba928-b73a-44f9-9b4a-c941ca49aebb tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Start destroying the instance on the hypervisor. {{(pid=70975) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 18 16:21:51 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:51 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.141s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:21:51 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/6aece7dd-d545-4e26-9cb7-30ee0b01ebb2/disk 1073741824 {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:21:51 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:51 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/6aece7dd-d545-4e26-9cb7-30ee0b01ebb2/disk 1073741824" returned: 0 in 0.054s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:21:51 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d 
tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.200s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:21:51 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:21:51 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.139s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:21:51 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Checking if we can resize image /opt/stack/data/nova/instances/6aece7dd-d545-4e26-9cb7-30ee0b01ebb2/disk. 
size=1073741824 {{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 18 16:21:51 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6aece7dd-d545-4e26-9cb7-30ee0b01ebb2/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:21:51 user nova-compute[70975]: DEBUG nova.compute.manager [req-66b6b5ae-8506-40e8-bcad-98555ef5eb52 req-b54135de-9df0-49ae-a729-06a680bc8e18 service nova] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Received event network-vif-unplugged-fb818849-31a0-4c25-b42d-ca19fe250ca6 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:21:51 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-66b6b5ae-8506-40e8-bcad-98555ef5eb52 req-b54135de-9df0-49ae-a729-06a680bc8e18 service nova] Acquiring lock "f6d6085d-9e15-4e29-ab90-3a8928971324-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:21:51 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-66b6b5ae-8506-40e8-bcad-98555ef5eb52 req-b54135de-9df0-49ae-a729-06a680bc8e18 service nova] Lock "f6d6085d-9e15-4e29-ab90-3a8928971324-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:21:51 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-66b6b5ae-8506-40e8-bcad-98555ef5eb52 req-b54135de-9df0-49ae-a729-06a680bc8e18 service nova] Lock "f6d6085d-9e15-4e29-ab90-3a8928971324-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:21:51 user nova-compute[70975]: DEBUG nova.compute.manager [req-66b6b5ae-8506-40e8-bcad-98555ef5eb52 req-b54135de-9df0-49ae-a729-06a680bc8e18 service nova] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] No waiting events found dispatching network-vif-unplugged-fb818849-31a0-4c25-b42d-ca19fe250ca6 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:21:51 user nova-compute[70975]: DEBUG nova.compute.manager [req-66b6b5ae-8506-40e8-bcad-98555ef5eb52 req-b54135de-9df0-49ae-a729-06a680bc8e18 service nova] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Received event network-vif-unplugged-fb818849-31a0-4c25-b42d-ca19fe250ca6 for instance with task_state deleting. 
{{(pid=70975) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 18 16:21:52 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6aece7dd-d545-4e26-9cb7-30ee0b01ebb2/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:21:52 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Cannot resize image /opt/stack/data/nova/instances/6aece7dd-d545-4e26-9cb7-30ee0b01ebb2/disk to a smaller size. {{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 18 16:21:52 user nova-compute[70975]: DEBUG nova.objects.instance [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lazy-loading 'migration_context' on Instance uuid 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2 {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:21:52 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Created local disks {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 18 16:21:52 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Ensure instance console log exists: /opt/stack/data/nova/instances/6aece7dd-d545-4e26-9cb7-30ee0b01ebb2/console.log {{(pid=70975) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 18 16:21:52 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:21:52 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:21:52 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 
tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:21:52 user nova-compute[70975]: DEBUG nova.network.neutron [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Successfully created port: 8d71cc71-9d8c-428d-ad04-69a31a967fe9 {{(pid=70975) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 18 16:21:52 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Instance destroyed successfully. Apr 18 16:21:52 user nova-compute[70975]: DEBUG nova.objects.instance [None req-eb2ba928-b73a-44f9-9b4a-c941ca49aebb tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lazy-loading 'resources' on Instance uuid f6d6085d-9e15-4e29-ab90-3a8928971324 {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:21:52 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-eb2ba928-b73a-44f9-9b4a-c941ca49aebb tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:19:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-2112657845',display_name='tempest-AttachVolumeNegativeTest-server-2112657845',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-2112657845',id=16,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHYRXU/ibSPY+lfyweoe12uOrvfmUvG6DlTq9LgRSH5Mu+rZpmKAfw8UVQNbDlibCQU69kF6sfr+Z42hzsCh/sT3mzfLZiHHLTZ94at32kiiHcYOGoL6apTKhxzZMUuP2A==',key_name='tempest-keypair-901960884',keypairs=,launch_index=0,launched_at=2023-04-18T16:20:07Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='6b4e8d8797be4c0e91b1401538f2eba8',ramdisk_id='',reservation_id='r-bu0v0mk3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeNegativeTest-216357456',owner_user_name='tempest-AttachVolumeNegativeTest-216357456-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-18T16:20:08Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='af90e17ec027463fa8793e8539c39e13',uuid=f6d6085d-9e15-4e29-ab90-3a8928971324,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fb818849-31a0-4c25-b42d-ca19fe250ca6", "address": "fa:16:3e:54:c5:ae", "network": {"id": "02aca424-2923-404b-9c66-76bec89f82b7", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1255258227-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.106", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6b4e8d8797be4c0e91b1401538f2eba8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb818849-31", "ovs_interfaceid": "fb818849-31a0-4c25-b42d-ca19fe250ca6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 18 16:21:52 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-eb2ba928-b73a-44f9-9b4a-c941ca49aebb tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Converting VIF {"id": "fb818849-31a0-4c25-b42d-ca19fe250ca6", "address": "fa:16:3e:54:c5:ae", "network": {"id": "02aca424-2923-404b-9c66-76bec89f82b7", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1255258227-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.106", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "6b4e8d8797be4c0e91b1401538f2eba8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb818849-31", "ovs_interfaceid": "fb818849-31a0-4c25-b42d-ca19fe250ca6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:21:52 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-eb2ba928-b73a-44f9-9b4a-c941ca49aebb tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:54:c5:ae,bridge_name='br-int',has_traffic_filtering=True,id=fb818849-31a0-4c25-b42d-ca19fe250ca6,network=Network(02aca424-2923-404b-9c66-76bec89f82b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb818849-31') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:21:52 user nova-compute[70975]: DEBUG os_vif [None req-eb2ba928-b73a-44f9-9b4a-c941ca49aebb tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:54:c5:ae,bridge_name='br-int',has_traffic_filtering=True,id=fb818849-31a0-4c25-b42d-ca19fe250ca6,network=Network(02aca424-2923-404b-9c66-76bec89f82b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb818849-31') {{(pid=70975) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 18 16:21:52 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:52 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfb818849-31, bridge=br-int, if_exists=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:21:52 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:21:52 user nova-compute[70975]: INFO os_vif [None req-eb2ba928-b73a-44f9-9b4a-c941ca49aebb tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:54:c5:ae,bridge_name='br-int',has_traffic_filtering=True,id=fb818849-31a0-4c25-b42d-ca19fe250ca6,network=Network(02aca424-2923-404b-9c66-76bec89f82b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb818849-31') Apr 18 16:21:52 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-eb2ba928-b73a-44f9-9b4a-c941ca49aebb tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Deleting instance files 
/opt/stack/data/nova/instances/f6d6085d-9e15-4e29-ab90-3a8928971324_del Apr 18 16:21:52 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-eb2ba928-b73a-44f9-9b4a-c941ca49aebb tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Deletion of /opt/stack/data/nova/instances/f6d6085d-9e15-4e29-ab90-3a8928971324_del complete Apr 18 16:21:52 user nova-compute[70975]: INFO nova.compute.manager [None req-eb2ba928-b73a-44f9-9b4a-c941ca49aebb tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Took 0.86 seconds to destroy the instance on the hypervisor. Apr 18 16:21:52 user nova-compute[70975]: DEBUG oslo.service.loopingcall [None req-eb2ba928-b73a-44f9-9b4a-c941ca49aebb tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=70975) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 18 16:21:52 user nova-compute[70975]: DEBUG nova.compute.manager [-] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Deallocating network for instance {{(pid=70975) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 18 16:21:52 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] deallocate_for_instance() {{(pid=70975) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 18 16:21:52 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:52 user nova-compute[70975]: DEBUG nova.network.neutron [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Successfully updated port: 8d71cc71-9d8c-428d-ad04-69a31a967fe9 {{(pid=70975) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 18 16:21:52 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Acquiring lock "refresh_cache-6aece7dd-d545-4e26-9cb7-30ee0b01ebb2" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:21:52 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Acquired lock "refresh_cache-6aece7dd-d545-4e26-9cb7-30ee0b01ebb2" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:21:52 user nova-compute[70975]: DEBUG nova.network.neutron [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Building network info cache for instance {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 18 16:21:52 user 
nova-compute[70975]: DEBUG nova.compute.manager [req-4fafaea9-b3b3-409a-8b78-852e45661979 req-4f785640-9116-4260-a027-b05adbd554d6 service nova] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Received event network-changed-8d71cc71-9d8c-428d-ad04-69a31a967fe9 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:21:52 user nova-compute[70975]: DEBUG nova.compute.manager [req-4fafaea9-b3b3-409a-8b78-852e45661979 req-4f785640-9116-4260-a027-b05adbd554d6 service nova] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Refreshing instance network info cache due to event network-changed-8d71cc71-9d8c-428d-ad04-69a31a967fe9. {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 18 16:21:52 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-4fafaea9-b3b3-409a-8b78-852e45661979 req-4f785640-9116-4260-a027-b05adbd554d6 service nova] Acquiring lock "refresh_cache-6aece7dd-d545-4e26-9cb7-30ee0b01ebb2" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:21:52 user nova-compute[70975]: DEBUG nova.network.neutron [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Instance cache missing network info. {{(pid=70975) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG nova.network.neutron [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Updating instance_info_cache with network_info: [{"id": "8d71cc71-9d8c-428d-ad04-69a31a967fe9", "address": "fa:16:3e:f6:64:6c", "network": {"id": "923d10dc-c67e-4426-9c6e-856e903e2446", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-131274818-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f516f5ec45ca4508841c77f79e8c038b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d71cc71-9d", "ovs_interfaceid": "8d71cc71-9d8c-428d-ad04-69a31a967fe9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Releasing lock "refresh_cache-6aece7dd-d545-4e26-9cb7-30ee0b01ebb2" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG nova.compute.manager [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 
tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Instance network_info: |[{"id": "8d71cc71-9d8c-428d-ad04-69a31a967fe9", "address": "fa:16:3e:f6:64:6c", "network": {"id": "923d10dc-c67e-4426-9c6e-856e903e2446", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-131274818-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f516f5ec45ca4508841c77f79e8c038b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d71cc71-9d", "ovs_interfaceid": "8d71cc71-9d8c-428d-ad04-69a31a967fe9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-4fafaea9-b3b3-409a-8b78-852e45661979 req-4f785640-9116-4260-a027-b05adbd554d6 service nova] Acquired lock "refresh_cache-6aece7dd-d545-4e26-9cb7-30ee0b01ebb2" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG nova.network.neutron [req-4fafaea9-b3b3-409a-8b78-852e45661979 req-4f785640-9116-4260-a027-b05adbd554d6 service nova] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Refreshing network info cache for port 8d71cc71-9d8c-428d-ad04-69a31a967fe9 {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Start _get_guest_xml network_info=[{"id": "8d71cc71-9d8c-428d-ad04-69a31a967fe9", "address": "fa:16:3e:f6:64:6c", "network": {"id": "923d10dc-c67e-4426-9c6e-856e903e2446", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-131274818-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f516f5ec45ca4508841c77f79e8c038b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d71cc71-9d", "ovs_interfaceid": "8d71cc71-9d8c-428d-ad04-69a31a967fe9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encrypted': False, 'device_type': 'disk', 'encryption_secret_uuid': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'b11a20de-f82a-4158-b53e-0a0c7a1552cb'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 18 16:21:53 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:21:53 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:21:53 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70975) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-18T16:11:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=), allow threads: True {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Flavor limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 18 16:21:53 user nova-compute[70975]: 
DEBUG nova.virt.hardware [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Image limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Flavor pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Image pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Got 1 possible topologies {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d 
tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:21:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1835828907',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1835828907',id=21,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f516f5ec45ca4508841c77f79e8c038b',ramdisk_id='',reservation_id='r-jkqz8m09',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-2021464272',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:21:51Z,user_data=None,user_id='c54c277689214bd0a2cadb1e2ac288a9',uuid=6aece7dd-d545-4e26-9cb7-30ee0b01ebb2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8d71cc71-9d8c-428d-ad04-69a31a967fe9", "address": "fa:16:3e:f6:64:6c", "network": {"id": "923d10dc-c67e-4426-9c6e-856e903e2446", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-131274818-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f516f5ec45ca4508841c77f79e8c038b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d71cc71-9d", "ovs_interfaceid": "8d71cc71-9d8c-428d-ad04-69a31a967fe9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70975) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 
tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Converting VIF {"id": "8d71cc71-9d8c-428d-ad04-69a31a967fe9", "address": "fa:16:3e:f6:64:6c", "network": {"id": "923d10dc-c67e-4426-9c6e-856e903e2446", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-131274818-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f516f5ec45ca4508841c77f79e8c038b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d71cc71-9d", "ovs_interfaceid": "8d71cc71-9d8c-428d-ad04-69a31a967fe9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f6:64:6c,bridge_name='br-int',has_traffic_filtering=True,id=8d71cc71-9d8c-428d-ad04-69a31a967fe9,network=Network(923d10dc-c67e-4426-9c6e-856e903e2446),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8d71cc71-9d') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG nova.objects.instance [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lazy-loading 'pci_devices' on Instance uuid 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2 {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] End _get_guest_xml xml= [multi-line libvirt guest XML omitted: the XML markup was stripped when this log was captured; the surviving values are uuid 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2, name instance-00000015, memory 131072 (KiB), 1 vCPU, nova metadata (server name tempest-ServerBootFromVolumeStableRescueTest-server-1835828907, creation time 2023-04-18 16:21:53, flavor m1.nano values 128, 1, 0, 0, 1, owner tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member, project tempest-ServerBootFromVolumeStableRescueTest-2021464272), sysinfo OpenStack Foundation / OpenStack Nova / 0.0.0 / Virtual Machine, os type hvm, CPU model Nehalem, RNG backend /dev/urandom]
{{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] vif_type=ovs
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:21:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1835828907',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1835828907',id=21,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f516f5ec45ca4508841c77f79e8c038b',ramdisk_id='',reservation_id='r-jkqz8m09',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-2021464272',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:21:51Z,user_data=None,user_id='c54c277689214bd0a2cadb1e2ac288a9',uuid=6aece7dd-d545-4e26-9cb7-30ee0b01ebb2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8d71cc71-9d8c-428d-ad04-69a31a967fe9", "address": "fa:16:3e:f6:64:6c", "network": {"id": "923d10dc-c67e-4426-9c6e-856e903e2446", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-131274818-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f516f5ec45ca4508841c77f79e8c038b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d71cc71-9d", "ovs_interfaceid": "8d71cc71-9d8c-428d-ad04-69a31a967fe9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Converting VIF {"id": "8d71cc71-9d8c-428d-ad04-69a31a967fe9", "address": 
"fa:16:3e:f6:64:6c", "network": {"id": "923d10dc-c67e-4426-9c6e-856e903e2446", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-131274818-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f516f5ec45ca4508841c77f79e8c038b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d71cc71-9d", "ovs_interfaceid": "8d71cc71-9d8c-428d-ad04-69a31a967fe9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f6:64:6c,bridge_name='br-int',has_traffic_filtering=True,id=8d71cc71-9d8c-428d-ad04-69a31a967fe9,network=Network(923d10dc-c67e-4426-9c6e-856e903e2446),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8d71cc71-9d') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG os_vif [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:64:6c,bridge_name='br-int',has_traffic_filtering=True,id=8d71cc71-9d8c-428d-ad04-69a31a967fe9,network=Network(923d10dc-c67e-4426-9c6e-856e903e2446),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8d71cc71-9d') {{(pid=70975) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8d71cc71-9d, may_exist=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:21:53 user 
nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8d71cc71-9d, col_values=(('external_ids', {'iface-id': '8d71cc71-9d8c-428d-ad04-69a31a967fe9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f6:64:6c', 'vm-uuid': '6aece7dd-d545-4e26-9cb7-30ee0b01ebb2'}),)) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:53 user nova-compute[70975]: INFO os_vif [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:64:6c,bridge_name='br-int',has_traffic_filtering=True,id=8d71cc71-9d8c-428d-ad04-69a31a967fe9,network=Network(923d10dc-c67e-4426-9c6e-856e903e2446),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8d71cc71-9d') Apr 18 16:21:53 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] No BDM found with device name vda, not building metadata. {{(pid=70975) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] No VIF found with MAC fa:16:3e:f6:64:6c, not building metadata {{(pid=70975) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Updating instance_info_cache with network_info: [] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:21:53 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Took 0.97 seconds to deallocate network for instance. 
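[Editor's sketch] A minimal illustration of what the ovsdbapp transactions above (AddBridgeCommand, AddPortCommand, DbSetCommand) amount to, expressed as the equivalent ovs-vsctl invocations. This is not os-vif's implementation; the port, MAC, and UUID values are copied from the log entries, and the apply()/dry_run helper is hypothetical. By default it only prints the commands, since actually applying them needs a running Open vSwitch and root privileges.

    import shlex
    import subprocess

    PORT = "tap8d71cc71-9d"
    IFACE_ID = "8d71cc71-9d8c-428d-ad04-69a31a967fe9"   # neutron port id, used as iface-id
    MAC = "fa:16:3e:f6:64:6c"
    VM_UUID = "6aece7dd-d545-4e26-9cb7-30ee0b01ebb2"

    COMMANDS = [
        # AddBridgeCommand(name=br-int, may_exist=True, datapath_type=system)
        "ovs-vsctl --may-exist add-br br-int -- set Bridge br-int datapath_type=system",
        # AddPortCommand followed by DbSetCommand on the Interface's external_ids
        f"ovs-vsctl --may-exist add-port br-int {PORT} -- set Interface {PORT} "
        f"external_ids:iface-id={IFACE_ID} external_ids:iface-status=active "
        f"external_ids:attached-mac={MAC} external_ids:vm-uuid={VM_UUID}",
    ]

    def apply(dry_run=True):
        for cmd in COMMANDS:
            print(cmd)
            if not dry_run:
                subprocess.run(shlex.split(cmd), check=True)

    if __name__ == "__main__":
        apply()

The external_ids keys mirror the DbSetCommand above; ovn-controller watches for iface-id to bind the logical port, which is roughly why neutron then sends the network-vif-plugged events that follow in this log.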
Apr 18 16:21:53 user nova-compute[70975]: DEBUG nova.compute.manager [req-9f2e2782-65a2-4384-9e74-2ca99f28836f req-2a9b1633-098d-47b7-a9b0-7e6fc638716a service nova] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Received event network-vif-deleted-fb818849-31a0-4c25-b42d-ca19fe250ca6 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-eb2ba928-b73a-44f9-9b4a-c941ca49aebb tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-eb2ba928-b73a-44f9-9b4a-c941ca49aebb tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG nova.network.neutron [req-4fafaea9-b3b3-409a-8b78-852e45661979 req-4f785640-9116-4260-a027-b05adbd554d6 service nova] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Updated VIF entry in instance network info cache for port 8d71cc71-9d8c-428d-ad04-69a31a967fe9. {{(pid=70975) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG nova.network.neutron [req-4fafaea9-b3b3-409a-8b78-852e45661979 req-4f785640-9116-4260-a027-b05adbd554d6 service nova] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Updating instance_info_cache with network_info: [{"id": "8d71cc71-9d8c-428d-ad04-69a31a967fe9", "address": "fa:16:3e:f6:64:6c", "network": {"id": "923d10dc-c67e-4426-9c6e-856e903e2446", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-131274818-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f516f5ec45ca4508841c77f79e8c038b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d71cc71-9d", "ovs_interfaceid": "8d71cc71-9d8c-428d-ad04-69a31a967fe9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-4fafaea9-b3b3-409a-8b78-852e45661979 req-4f785640-9116-4260-a027-b05adbd554d6 service nova] Releasing lock "refresh_cache-6aece7dd-d545-4e26-9cb7-30ee0b01ebb2" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-eb2ba928-b73a-44f9-9b4a-c941ca49aebb tempest-AttachVolumeNegativeTest-216357456 
tempest-AttachVolumeNegativeTest-216357456-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-eb2ba928-b73a-44f9-9b4a-c941ca49aebb tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-eb2ba928-b73a-44f9-9b4a-c941ca49aebb tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.294s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:21:53 user nova-compute[70975]: INFO nova.scheduler.client.report [None req-eb2ba928-b73a-44f9-9b4a-c941ca49aebb tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Deleted allocations for instance f6d6085d-9e15-4e29-ab90-3a8928971324 Apr 18 16:21:53 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-eb2ba928-b73a-44f9-9b4a-c941ca49aebb tempest-AttachVolumeNegativeTest-216357456 tempest-AttachVolumeNegativeTest-216357456-project-member] Lock "f6d6085d-9e15-4e29-ab90-3a8928971324" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.344s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG nova.compute.manager [req-192365a9-feff-4073-ab6c-5fd1ce858338 req-8b8928bb-e17e-42aa-99e3-52a1b493d3a0 service nova] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Received event network-vif-plugged-fb818849-31a0-4c25-b42d-ca19fe250ca6 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-192365a9-feff-4073-ab6c-5fd1ce858338 req-8b8928bb-e17e-42aa-99e3-52a1b493d3a0 service nova] Acquiring lock "f6d6085d-9e15-4e29-ab90-3a8928971324-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-192365a9-feff-4073-ab6c-5fd1ce858338 req-8b8928bb-e17e-42aa-99e3-52a1b493d3a0 service nova] Lock "f6d6085d-9e15-4e29-ab90-3a8928971324-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-192365a9-feff-4073-ab6c-5fd1ce858338 
req-8b8928bb-e17e-42aa-99e3-52a1b493d3a0 service nova] Lock "f6d6085d-9e15-4e29-ab90-3a8928971324-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:21:53 user nova-compute[70975]: DEBUG nova.compute.manager [req-192365a9-feff-4073-ab6c-5fd1ce858338 req-8b8928bb-e17e-42aa-99e3-52a1b493d3a0 service nova] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] No waiting events found dispatching network-vif-plugged-fb818849-31a0-4c25-b42d-ca19fe250ca6 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:21:54 user nova-compute[70975]: WARNING nova.compute.manager [req-192365a9-feff-4073-ab6c-5fd1ce858338 req-8b8928bb-e17e-42aa-99e3-52a1b493d3a0 service nova] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Received unexpected event network-vif-plugged-fb818849-31a0-4c25-b42d-ca19fe250ca6 for instance with vm_state deleted and task_state None. Apr 18 16:21:54 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:54 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:54 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:54 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:54 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:54 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:55 user nova-compute[70975]: DEBUG nova.compute.manager [req-498c15ec-3045-4cda-a2ab-6b28e2aaa166 req-815485a7-9cc9-4013-b152-9c96f46a81e7 service nova] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Received event network-vif-plugged-8d71cc71-9d8c-428d-ad04-69a31a967fe9 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:21:55 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-498c15ec-3045-4cda-a2ab-6b28e2aaa166 req-815485a7-9cc9-4013-b152-9c96f46a81e7 service nova] Acquiring lock "6aece7dd-d545-4e26-9cb7-30ee0b01ebb2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:21:55 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-498c15ec-3045-4cda-a2ab-6b28e2aaa166 req-815485a7-9cc9-4013-b152-9c96f46a81e7 service nova] Lock "6aece7dd-d545-4e26-9cb7-30ee0b01ebb2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:21:55 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-498c15ec-3045-4cda-a2ab-6b28e2aaa166 req-815485a7-9cc9-4013-b152-9c96f46a81e7 service nova] Lock 
"6aece7dd-d545-4e26-9cb7-30ee0b01ebb2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:21:55 user nova-compute[70975]: DEBUG nova.compute.manager [req-498c15ec-3045-4cda-a2ab-6b28e2aaa166 req-815485a7-9cc9-4013-b152-9c96f46a81e7 service nova] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] No waiting events found dispatching network-vif-plugged-8d71cc71-9d8c-428d-ad04-69a31a967fe9 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:21:55 user nova-compute[70975]: WARNING nova.compute.manager [req-498c15ec-3045-4cda-a2ab-6b28e2aaa166 req-815485a7-9cc9-4013-b152-9c96f46a81e7 service nova] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Received unexpected event network-vif-plugged-8d71cc71-9d8c-428d-ad04-69a31a967fe9 for instance with vm_state building and task_state spawning. Apr 18 16:21:55 user nova-compute[70975]: DEBUG nova.compute.manager [req-498c15ec-3045-4cda-a2ab-6b28e2aaa166 req-815485a7-9cc9-4013-b152-9c96f46a81e7 service nova] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Received event network-vif-plugged-8d71cc71-9d8c-428d-ad04-69a31a967fe9 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:21:55 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-498c15ec-3045-4cda-a2ab-6b28e2aaa166 req-815485a7-9cc9-4013-b152-9c96f46a81e7 service nova] Acquiring lock "6aece7dd-d545-4e26-9cb7-30ee0b01ebb2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:21:55 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-498c15ec-3045-4cda-a2ab-6b28e2aaa166 req-815485a7-9cc9-4013-b152-9c96f46a81e7 service nova] Lock "6aece7dd-d545-4e26-9cb7-30ee0b01ebb2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:21:55 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-498c15ec-3045-4cda-a2ab-6b28e2aaa166 req-815485a7-9cc9-4013-b152-9c96f46a81e7 service nova] Lock "6aece7dd-d545-4e26-9cb7-30ee0b01ebb2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:21:55 user nova-compute[70975]: DEBUG nova.compute.manager [req-498c15ec-3045-4cda-a2ab-6b28e2aaa166 req-815485a7-9cc9-4013-b152-9c96f46a81e7 service nova] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] No waiting events found dispatching network-vif-plugged-8d71cc71-9d8c-428d-ad04-69a31a967fe9 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:21:55 user nova-compute[70975]: WARNING nova.compute.manager [req-498c15ec-3045-4cda-a2ab-6b28e2aaa166 req-815485a7-9cc9-4013-b152-9c96f46a81e7 service nova] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Received unexpected event network-vif-plugged-8d71cc71-9d8c-428d-ad04-69a31a967fe9 for instance with vm_state building and task_state spawning. 
Apr 18 16:21:56 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Resumed> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:21:56 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] VM Resumed (Lifecycle Event) Apr 18 16:21:56 user nova-compute[70975]: DEBUG nova.compute.manager [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Instance event wait completed in 0 seconds for {{(pid=70975) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 18 16:21:56 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Guest created on hypervisor {{(pid=70975) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 18 16:21:56 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Instance spawned successfully. Apr 18 16:21:56 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 18 16:21:56 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:21:56 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:21:56 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Found default for hw_cdrom_bus of ide {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:21:56 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Found default for hw_disk_bus of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:21:56 
user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Found default for hw_input_bus of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:21:56 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Found default for hw_pointer_model of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:21:56 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Found default for hw_video_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:21:56 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Found default for hw_vif_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:21:56 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] During sync_power_state the instance has a pending task (spawning). Skip. Apr 18 16:21:56 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Started> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:21:56 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] VM Started (Lifecycle Event) Apr 18 16:21:56 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:21:56 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:21:56 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] During sync_power_state the instance has a pending task (spawning). Skip. 
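[Editor's sketch] The "Synchronizing instance power state ..." / "During sync_power_state the instance has a pending task (spawning). Skip." pairs above follow a simple rule: compare the power state recorded in the database with what the hypervisor reports, but do nothing while a task is still in flight. A rough, hand-written illustration follows; it is not nova's code, and the state constants are only the two values visible in the log.

    # Power-state codes as they appear in the log: 0 = NOSTATE, 1 = RUNNING.
    NOSTATE, RUNNING = 0, 1

    def sync_power_state(db_power_state, vm_power_state, task_state):
        """Decide what a lifecycle-event handler should do with a power state."""
        if task_state is not None:
            # e.g. task_state 'spawning' -> "... has a pending task (spawning). Skip."
            return f"skip: pending task ({task_state})"
        if db_power_state != vm_power_state:
            return f"update DB power_state {db_power_state} -> {vm_power_state}"
        return "already in sync"

    print(sync_power_state(NOSTATE, RUNNING, "spawning"))   # skipped, as in the log
    print(sync_power_state(NOSTATE, RUNNING, None))         # would update 0 -> 1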
Apr 18 16:21:56 user nova-compute[70975]: INFO nova.compute.manager [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Took 5.47 seconds to spawn the instance on the hypervisor. Apr 18 16:21:56 user nova-compute[70975]: DEBUG nova.compute.manager [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:21:56 user nova-compute[70975]: INFO nova.compute.manager [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Took 6.18 seconds to build instance. Apr 18 16:21:56 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-e80ddbb0-b3b5-44d7-9369-ae66b9123b9d tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "6aece7dd-d545-4e26-9cb7-30ee0b01ebb2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.267s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:21:57 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:58 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:58 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-f5c5c782-2268-4c71-b629-64042e43e389 tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Acquiring lock "5f4e6f9b-5413-4399-83ca-9bc78911db38" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:21:58 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-f5c5c782-2268-4c71-b629-64042e43e389 tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Lock "5f4e6f9b-5413-4399-83ca-9bc78911db38" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:21:58 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-f5c5c782-2268-4c71-b629-64042e43e389 tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Acquiring lock "5f4e6f9b-5413-4399-83ca-9bc78911db38-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:21:58 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-f5c5c782-2268-4c71-b629-64042e43e389 tempest-SnapshotDataIntegrityTests-155672580 
tempest-SnapshotDataIntegrityTests-155672580-project-member] Lock "5f4e6f9b-5413-4399-83ca-9bc78911db38-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:21:58 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-f5c5c782-2268-4c71-b629-64042e43e389 tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Lock "5f4e6f9b-5413-4399-83ca-9bc78911db38-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:21:58 user nova-compute[70975]: INFO nova.compute.manager [None req-f5c5c782-2268-4c71-b629-64042e43e389 tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Terminating instance Apr 18 16:21:58 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f5c5c782-2268-4c71-b629-64042e43e389 tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Start destroying the instance on the hypervisor. {{(pid=70975) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 18 16:21:59 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:59 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:59 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:59 user nova-compute[70975]: DEBUG nova.compute.manager [req-fb3130b6-0f39-48b9-a019-d7a086f28fa4 req-13e49141-1489-4f0b-978f-6500ec1c9d0e service nova] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Received event network-vif-unplugged-4330b5f4-c990-4a7d-aa5f-f95315fddf78 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:21:59 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-fb3130b6-0f39-48b9-a019-d7a086f28fa4 req-13e49141-1489-4f0b-978f-6500ec1c9d0e service nova] Acquiring lock "5f4e6f9b-5413-4399-83ca-9bc78911db38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:21:59 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-fb3130b6-0f39-48b9-a019-d7a086f28fa4 req-13e49141-1489-4f0b-978f-6500ec1c9d0e service nova] Lock "5f4e6f9b-5413-4399-83ca-9bc78911db38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:21:59 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-fb3130b6-0f39-48b9-a019-d7a086f28fa4 req-13e49141-1489-4f0b-978f-6500ec1c9d0e service nova] Lock "5f4e6f9b-5413-4399-83ca-9bc78911db38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: 
held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:21:59 user nova-compute[70975]: DEBUG nova.compute.manager [req-fb3130b6-0f39-48b9-a019-d7a086f28fa4 req-13e49141-1489-4f0b-978f-6500ec1c9d0e service nova] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] No waiting events found dispatching network-vif-unplugged-4330b5f4-c990-4a7d-aa5f-f95315fddf78 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:21:59 user nova-compute[70975]: DEBUG nova.compute.manager [req-fb3130b6-0f39-48b9-a019-d7a086f28fa4 req-13e49141-1489-4f0b-978f-6500ec1c9d0e service nova] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Received event network-vif-unplugged-4330b5f4-c990-4a7d-aa5f-f95315fddf78 for instance with task_state deleting. {{(pid=70975) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 18 16:21:59 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Instance destroyed successfully. Apr 18 16:21:59 user nova-compute[70975]: DEBUG nova.objects.instance [None req-f5c5c782-2268-4c71-b629-64042e43e389 tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Lazy-loading 'resources' on Instance uuid 5f4e6f9b-5413-4399-83ca-9bc78911db38 {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:21:59 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-f5c5c782-2268-4c71-b629-64042e43e389 tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:20:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-SnapshotDataIntegrityTests-server-751902854',display_name='tempest-SnapshotDataIntegrityTests-server-751902854',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-snapshotdataintegritytests-server-751902854',id=17,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEjB7c1t/30/n9+kA02kID4xCptJhD3sPRt7RADFNBND5ODgA1T9Tp0ttLsMngsvRYW4lRZy7x7r7VmJz6i3Dd8HWEtRffPcfiU0xC7owaaIepRkCkRUzrPyKKptrcjx0A==',key_name='tempest-SnapshotDataIntegrityTests-898730089',keypairs=,launch_index=0,launched_at=2023-04-18T16:20:13Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='000fb9c948224fe3b595882d36cfb859',ramdisk_id='',reservation_id='r-1fmw5k8n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-SnapshotDataIntegrityTests-155672580',owner_user_name='tempest-SnapshotDataIntegrityTests-155672580-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-18T16:20:13Z,user_data=None,user_id='b01da22d0c8c4ee580567646e279d7b9',uuid=5f4e6f9b-5413-4399-83ca-9bc78911db38,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4330b5f4-c990-4a7d-aa5f-f95315fddf78", "address": "fa:16:3e:6f:03:b0", "network": {"id": "c17e54e4-8788-475c-b342-65dddd71342f", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-2093298384-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "000fb9c948224fe3b595882d36cfb859", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4330b5f4-c9", "ovs_interfaceid": "4330b5f4-c990-4a7d-aa5f-f95315fddf78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 18 16:21:59 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-f5c5c782-2268-4c71-b629-64042e43e389 tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Converting VIF {"id": "4330b5f4-c990-4a7d-aa5f-f95315fddf78", "address": "fa:16:3e:6f:03:b0", "network": {"id": "c17e54e4-8788-475c-b342-65dddd71342f", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-2093298384-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, 
"tenant_id": "000fb9c948224fe3b595882d36cfb859", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4330b5f4-c9", "ovs_interfaceid": "4330b5f4-c990-4a7d-aa5f-f95315fddf78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:21:59 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-f5c5c782-2268-4c71-b629-64042e43e389 tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6f:03:b0,bridge_name='br-int',has_traffic_filtering=True,id=4330b5f4-c990-4a7d-aa5f-f95315fddf78,network=Network(c17e54e4-8788-475c-b342-65dddd71342f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4330b5f4-c9') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:21:59 user nova-compute[70975]: DEBUG os_vif [None req-f5c5c782-2268-4c71-b629-64042e43e389 tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6f:03:b0,bridge_name='br-int',has_traffic_filtering=True,id=4330b5f4-c990-4a7d-aa5f-f95315fddf78,network=Network(c17e54e4-8788-475c-b342-65dddd71342f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4330b5f4-c9') {{(pid=70975) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 18 16:21:59 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:59 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4330b5f4-c9, bridge=br-int, if_exists=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:21:59 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:21:59 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:21:59 user nova-compute[70975]: INFO os_vif [None req-f5c5c782-2268-4c71-b629-64042e43e389 tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6f:03:b0,bridge_name='br-int',has_traffic_filtering=True,id=4330b5f4-c990-4a7d-aa5f-f95315fddf78,network=Network(c17e54e4-8788-475c-b342-65dddd71342f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4330b5f4-c9') Apr 18 16:21:59 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-f5c5c782-2268-4c71-b629-64042e43e389 tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Deleting instance files /opt/stack/data/nova/instances/5f4e6f9b-5413-4399-83ca-9bc78911db38_del Apr 18 
16:21:59 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-f5c5c782-2268-4c71-b629-64042e43e389 tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Deletion of /opt/stack/data/nova/instances/5f4e6f9b-5413-4399-83ca-9bc78911db38_del complete Apr 18 16:21:59 user nova-compute[70975]: INFO nova.compute.manager [None req-f5c5c782-2268-4c71-b629-64042e43e389 tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Took 0.70 seconds to destroy the instance on the hypervisor. Apr 18 16:21:59 user nova-compute[70975]: DEBUG oslo.service.loopingcall [None req-f5c5c782-2268-4c71-b629-64042e43e389 tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=70975) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 18 16:21:59 user nova-compute[70975]: DEBUG nova.compute.manager [-] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Deallocating network for instance {{(pid=70975) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 18 16:21:59 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] deallocate_for_instance() {{(pid=70975) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 18 16:22:00 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Updating instance_info_cache with network_info: [] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:22:00 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Took 0.48 seconds to deallocate network for instance. Apr 18 16:22:00 user nova-compute[70975]: DEBUG nova.compute.manager [req-2f602ab9-3c46-4150-99e5-bceb2e533fd5 req-44b4502c-dc69-426d-a5f5-72cf183f27dd service nova] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Received event network-vif-deleted-4330b5f4-c990-4a7d-aa5f-f95315fddf78 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:22:00 user nova-compute[70975]: INFO nova.compute.manager [req-2f602ab9-3c46-4150-99e5-bceb2e533fd5 req-44b4502c-dc69-426d-a5f5-72cf183f27dd service nova] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Neutron deleted interface 4330b5f4-c990-4a7d-aa5f-f95315fddf78; detaching it from the instance and deleting it from the info cache Apr 18 16:22:00 user nova-compute[70975]: DEBUG nova.network.neutron [req-2f602ab9-3c46-4150-99e5-bceb2e533fd5 req-44b4502c-dc69-426d-a5f5-72cf183f27dd service nova] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Updating instance_info_cache with network_info: [] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:22:00 user nova-compute[70975]: DEBUG nova.compute.manager [req-2f602ab9-3c46-4150-99e5-bceb2e533fd5 req-44b4502c-dc69-426d-a5f5-72cf183f27dd service nova] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Detach interface failed, port_id=4330b5f4-c990-4a7d-aa5f-f95315fddf78, reason: Instance 5f4e6f9b-5413-4399-83ca-9bc78911db38 could not be found. 
{{(pid=70975) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 18 16:22:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-f5c5c782-2268-4c71-b629-64042e43e389 tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:22:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-f5c5c782-2268-4c71-b629-64042e43e389 tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:22:00 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-f5c5c782-2268-4c71-b629-64042e43e389 tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:22:00 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-f5c5c782-2268-4c71-b629-64042e43e389 tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:22:00 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:22:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-f5c5c782-2268-4c71-b629-64042e43e389 tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.226s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:22:00 user nova-compute[70975]: INFO nova.scheduler.client.report [None req-f5c5c782-2268-4c71-b629-64042e43e389 tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Deleted allocations for instance 5f4e6f9b-5413-4399-83ca-9bc78911db38 Apr 18 16:22:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-f5c5c782-2268-4c71-b629-64042e43e389 tempest-SnapshotDataIntegrityTests-155672580 tempest-SnapshotDataIntegrityTests-155672580-project-member] Lock "5f4e6f9b-5413-4399-83ca-9bc78911db38" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 
1.593s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:22:01 user nova-compute[70975]: DEBUG nova.compute.manager [req-d95c7fc2-0fe8-48dd-9baa-507fd274f82f req-21b144d8-f318-4dc0-9607-ffa29944c45f service nova] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Received event network-vif-plugged-4330b5f4-c990-4a7d-aa5f-f95315fddf78 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:22:01 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-d95c7fc2-0fe8-48dd-9baa-507fd274f82f req-21b144d8-f318-4dc0-9607-ffa29944c45f service nova] Acquiring lock "5f4e6f9b-5413-4399-83ca-9bc78911db38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:22:01 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-d95c7fc2-0fe8-48dd-9baa-507fd274f82f req-21b144d8-f318-4dc0-9607-ffa29944c45f service nova] Lock "5f4e6f9b-5413-4399-83ca-9bc78911db38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:22:01 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-d95c7fc2-0fe8-48dd-9baa-507fd274f82f req-21b144d8-f318-4dc0-9607-ffa29944c45f service nova] Lock "5f4e6f9b-5413-4399-83ca-9bc78911db38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:22:01 user nova-compute[70975]: DEBUG nova.compute.manager [req-d95c7fc2-0fe8-48dd-9baa-507fd274f82f req-21b144d8-f318-4dc0-9607-ffa29944c45f service nova] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] No waiting events found dispatching network-vif-plugged-4330b5f4-c990-4a7d-aa5f-f95315fddf78 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:22:01 user nova-compute[70975]: WARNING nova.compute.manager [req-d95c7fc2-0fe8-48dd-9baa-507fd274f82f req-21b144d8-f318-4dc0-9607-ffa29944c45f service nova] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Received unexpected event network-vif-plugged-4330b5f4-c990-4a7d-aa5f-f95315fddf78 for instance with vm_state deleted and task_state None. Apr 18 16:22:01 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:22:01 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:22:01 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=70975) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 18 16:22:02 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:22:02 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:22:02 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:22:02 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:03 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:22:03 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Starting heal instance info cache {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 18 16:22:03 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Didn't find any instances for network info cache update. 
{{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 18 16:22:03 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager.update_available_resource {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:22:03 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:22:03 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:22:03 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.003s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:22:03 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Auditing locally available compute resources for user (node: user) {{(pid=70975) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 18 16:22:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:22:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:22:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:22:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=70975) 
execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:22:03 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6aece7dd-d545-4e26-9cb7-30ee0b01ebb2/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:22:04 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6aece7dd-d545-4e26-9cb7-30ee0b01ebb2/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:22:04 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6aece7dd-d545-4e26-9cb7-30ee0b01ebb2/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:22:04 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6aece7dd-d545-4e26-9cb7-30ee0b01ebb2/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:22:04 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c16a352d-3f0c-4688-a890-81be1fee9f35/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:22:04 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c16a352d-3f0c-4688-a890-81be1fee9f35/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:22:04 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c16a352d-3f0c-4688-a890-81be1fee9f35/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:22:04 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD 
"/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c16a352d-3f0c-4688-a890-81be1fee9f35/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:22:04 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:22:04 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:04 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk --force-share --output=json" returned: 0 in 0.128s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:22:04 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:22:04 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:22:04 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:22:04 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk --force-share --output=json" returned: 0 in 0.127s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:22:04 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 
-m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:22:05 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:22:05 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:22:05 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:22:05 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Hypervisor/Node resource view: name=user free_ram=8495MB free_disk=26.577987670898438GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": 
"label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": 
"0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70975) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 18 16:22:05 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:22:05 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:22:05 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 1b530349-680e-4def-86ef-29c340efa175 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:22:05 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 6528f05a-9f05-4f35-b991-687e4f47029e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:22:05 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance c16a352d-3f0c-4688-a890-81be1fee9f35 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:22:05 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 66df9389-d007-4737-8bb1-55bcb5f227ff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:22:05 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:22:05 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Total usable vcpus: 12, total allocated vcpus: 5 {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 18 16:22:05 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Final resource view: name=user phys_ram=16023MB used_ram=1152MB phys_disk=40GB used_disk=5GB total_vcpus=12 used_vcpus=5 pci_stats=[] {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 18 16:22:05 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:22:05 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:22:05 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None 
req-5374b104-62e1-472c-925f-f4defab71e2c None None] Compute_service record updated for user:user {{(pid=70975) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 18 16:22:05 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.304s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:22:07 user nova-compute[70975]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:22:07 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] VM Stopped (Lifecycle Event) Apr 18 16:22:07 user nova-compute[70975]: DEBUG nova.compute.manager [None req-22599589-568b-419c-9d0c-d8673e1f50db None None] [instance: f6d6085d-9e15-4e29-ab90-3a8928971324] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:22:07 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:07 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:22:09 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:22:09 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:12 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:14 user nova-compute[70975]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:22:14 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] VM Stopped (Lifecycle Event) Apr 18 16:22:14 user nova-compute[70975]: DEBUG nova.compute.manager [None req-4eca4609-3e8c-4854-841b-020393b0ddc7 None None] [instance: 5f4e6f9b-5413-4399-83ca-9bc78911db38] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:22:14 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:19 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:22 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 
{{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:24 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:26 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:29 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:32 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:33 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-7fbf1da3-a69e-4e59-9913-ba1d79e02b7a tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Acquiring lock "c16a352d-3f0c-4688-a890-81be1fee9f35" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:22:33 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-7fbf1da3-a69e-4e59-9913-ba1d79e02b7a tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Lock "c16a352d-3f0c-4688-a890-81be1fee9f35" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:22:33 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-7fbf1da3-a69e-4e59-9913-ba1d79e02b7a tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Acquiring lock "c16a352d-3f0c-4688-a890-81be1fee9f35-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:22:33 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-7fbf1da3-a69e-4e59-9913-ba1d79e02b7a tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Lock "c16a352d-3f0c-4688-a890-81be1fee9f35-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:22:33 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-7fbf1da3-a69e-4e59-9913-ba1d79e02b7a tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Lock "c16a352d-3f0c-4688-a890-81be1fee9f35-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:22:33 user nova-compute[70975]: INFO nova.compute.manager [None req-7fbf1da3-a69e-4e59-9913-ba1d79e02b7a tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Terminating instance Apr 18 16:22:33 user nova-compute[70975]: DEBUG nova.compute.manager [None req-7fbf1da3-a69e-4e59-9913-ba1d79e02b7a 
tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Start destroying the instance on the hypervisor. {{(pid=70975) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 18 16:22:33 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:33 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:34 user nova-compute[70975]: DEBUG nova.compute.manager [req-ccf83a26-c122-461b-b534-fb7efec2a9a1 req-9a8ebd8d-661a-46cb-ab2a-3a5208f3d490 service nova] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Received event network-vif-unplugged-159a1d71-92e3-4ac2-ad79-f530a49580e9 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:22:34 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-ccf83a26-c122-461b-b534-fb7efec2a9a1 req-9a8ebd8d-661a-46cb-ab2a-3a5208f3d490 service nova] Acquiring lock "c16a352d-3f0c-4688-a890-81be1fee9f35-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:22:34 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-ccf83a26-c122-461b-b534-fb7efec2a9a1 req-9a8ebd8d-661a-46cb-ab2a-3a5208f3d490 service nova] Lock "c16a352d-3f0c-4688-a890-81be1fee9f35-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:22:34 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-ccf83a26-c122-461b-b534-fb7efec2a9a1 req-9a8ebd8d-661a-46cb-ab2a-3a5208f3d490 service nova] Lock "c16a352d-3f0c-4688-a890-81be1fee9f35-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:22:34 user nova-compute[70975]: DEBUG nova.compute.manager [req-ccf83a26-c122-461b-b534-fb7efec2a9a1 req-9a8ebd8d-661a-46cb-ab2a-3a5208f3d490 service nova] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] No waiting events found dispatching network-vif-unplugged-159a1d71-92e3-4ac2-ad79-f530a49580e9 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:22:34 user nova-compute[70975]: DEBUG nova.compute.manager [req-ccf83a26-c122-461b-b534-fb7efec2a9a1 req-9a8ebd8d-661a-46cb-ab2a-3a5208f3d490 service nova] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Received event network-vif-unplugged-159a1d71-92e3-4ac2-ad79-f530a49580e9 for instance with task_state deleting. {{(pid=70975) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 18 16:22:34 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Instance destroyed successfully. 
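Note: the recurring "Acquiring lock ... by ..." / "acquired" / "released" DEBUG lines in this log come from oslo.concurrency's in-process locks, which the compute manager takes per instance UUID (plus a "<uuid>-events" companion lock guarding the external-event queue) while terminating an instance. Below is a minimal sketch of that pattern, assuming plain lockutils.synchronized decorators and a placeholder do_terminate_instance body; Nova's real code goes through its own wrapper around lockutils, so this is illustrative only.

    from oslo_concurrency import lockutils

    def terminate(uuid):
        # The 'by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance"'
        # fragment in the log is the qualified name of the decorated inner function
        # (the piece dropped between the two dots is Python's <locals> marker).
        # Entering and leaving the decorator produces the acquire/release DEBUG messages.
        @lockutils.synchronized(uuid)
        def do_terminate_instance():
            @lockutils.synchronized(uuid + "-events")
            def _clear_events():
                pass  # drop any queued external events for this instance
            _clear_events()
            # ... destroy the guest, unplug VIFs, deallocate networking ...

        do_terminate_instance()

    terminate("c16a352d-3f0c-4688-a890-81be1fee9f35")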
Apr 18 16:22:34 user nova-compute[70975]: DEBUG nova.objects.instance [None req-7fbf1da3-a69e-4e59-9913-ba1d79e02b7a tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Lazy-loading 'resources' on Instance uuid c16a352d-3f0c-4688-a890-81be1fee9f35 {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:22:34 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-7fbf1da3-a69e-4e59-9913-ba1d79e02b7a tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:20:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-1247400120',display_name='tempest-VolumesActionsTest-instance-1247400120',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesactionstest-instance-1247400120',id=18,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-18T16:20:49Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='fb49d5ddf6db4b1d807ba42fd37a919d',ramdisk_id='',reservation_id='r-2ruy1cgq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-VolumesActionsTest-1956026439',owner_user_name='tempest-VolumesActionsTest-1956026439-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-18T16:20:49Z,user_data=None,user_id='3e222f73f9194870b4fac305cdd60f3a',uuid=c16a352d-3f0c-4688-a890-81be1fee9f35,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "159a1d71-92e3-4ac2-ad79-f530a49580e9", "address": "fa:16:3e:f9:70:73", "network": {"id": "a4882813-5e0c-44e4-b47a-bc4b69fbead5", "bridge": "br-int", "label": "tempest-VolumesActionsTest-2022397269-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "fb49d5ddf6db4b1d807ba42fd37a919d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap159a1d71-92", "ovs_interfaceid": 
"159a1d71-92e3-4ac2-ad79-f530a49580e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 18 16:22:34 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-7fbf1da3-a69e-4e59-9913-ba1d79e02b7a tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Converting VIF {"id": "159a1d71-92e3-4ac2-ad79-f530a49580e9", "address": "fa:16:3e:f9:70:73", "network": {"id": "a4882813-5e0c-44e4-b47a-bc4b69fbead5", "bridge": "br-int", "label": "tempest-VolumesActionsTest-2022397269-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "fb49d5ddf6db4b1d807ba42fd37a919d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap159a1d71-92", "ovs_interfaceid": "159a1d71-92e3-4ac2-ad79-f530a49580e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:22:34 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-7fbf1da3-a69e-4e59-9913-ba1d79e02b7a tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:70:73,bridge_name='br-int',has_traffic_filtering=True,id=159a1d71-92e3-4ac2-ad79-f530a49580e9,network=Network(a4882813-5e0c-44e4-b47a-bc4b69fbead5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap159a1d71-92') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:22:34 user nova-compute[70975]: DEBUG os_vif [None req-7fbf1da3-a69e-4e59-9913-ba1d79e02b7a tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:70:73,bridge_name='br-int',has_traffic_filtering=True,id=159a1d71-92e3-4ac2-ad79-f530a49580e9,network=Network(a4882813-5e0c-44e4-b47a-bc4b69fbead5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap159a1d71-92') {{(pid=70975) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 18 16:22:34 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:34 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap159a1d71-92, bridge=br-int, if_exists=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:22:34 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:34 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) 
__log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:22:34 user nova-compute[70975]: INFO os_vif [None req-7fbf1da3-a69e-4e59-9913-ba1d79e02b7a tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:70:73,bridge_name='br-int',has_traffic_filtering=True,id=159a1d71-92e3-4ac2-ad79-f530a49580e9,network=Network(a4882813-5e0c-44e4-b47a-bc4b69fbead5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap159a1d71-92') Apr 18 16:22:34 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-7fbf1da3-a69e-4e59-9913-ba1d79e02b7a tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Deleting instance files /opt/stack/data/nova/instances/c16a352d-3f0c-4688-a890-81be1fee9f35_del Apr 18 16:22:34 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-7fbf1da3-a69e-4e59-9913-ba1d79e02b7a tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Deletion of /opt/stack/data/nova/instances/c16a352d-3f0c-4688-a890-81be1fee9f35_del complete Apr 18 16:22:34 user nova-compute[70975]: INFO nova.compute.manager [None req-7fbf1da3-a69e-4e59-9913-ba1d79e02b7a tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Took 0.68 seconds to destroy the instance on the hypervisor. Apr 18 16:22:34 user nova-compute[70975]: DEBUG oslo.service.loopingcall [None req-7fbf1da3-a69e-4e59-9913-ba1d79e02b7a tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=70975) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 18 16:22:34 user nova-compute[70975]: DEBUG nova.compute.manager [-] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Deallocating network for instance {{(pid=70975) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 18 16:22:34 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] deallocate_for_instance() {{(pid=70975) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 18 16:22:35 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Updating instance_info_cache with network_info: [] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:22:35 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Took 0.58 seconds to deallocate network for instance. 
Apr 18 16:22:35 user nova-compute[70975]: DEBUG nova.compute.manager [req-5c5ed526-8739-4770-90e2-4203116ec58c req-ad841434-fde9-4867-884a-cbf1b7088c1e service nova] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Received event network-vif-deleted-159a1d71-92e3-4ac2-ad79-f530a49580e9 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:22:35 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-7fbf1da3-a69e-4e59-9913-ba1d79e02b7a tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:22:35 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-7fbf1da3-a69e-4e59-9913-ba1d79e02b7a tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:22:35 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-7fbf1da3-a69e-4e59-9913-ba1d79e02b7a tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:22:35 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-7fbf1da3-a69e-4e59-9913-ba1d79e02b7a tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:22:35 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-7fbf1da3-a69e-4e59-9913-ba1d79e02b7a tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.221s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:22:35 user nova-compute[70975]: INFO nova.scheduler.client.report [None req-7fbf1da3-a69e-4e59-9913-ba1d79e02b7a tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Deleted allocations for instance c16a352d-3f0c-4688-a890-81be1fee9f35 Apr 18 16:22:35 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-7fbf1da3-a69e-4e59-9913-ba1d79e02b7a tempest-VolumesActionsTest-1956026439 tempest-VolumesActionsTest-1956026439-project-member] Lock "c16a352d-3f0c-4688-a890-81be1fee9f35" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.669s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:22:36 user 
nova-compute[70975]: DEBUG nova.compute.manager [req-b32f23c7-489d-44c9-a3b6-ff5bcecb95d8 req-0568c84f-44a2-4298-94e2-328ab94c90cc service nova] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Received event network-vif-plugged-159a1d71-92e3-4ac2-ad79-f530a49580e9 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:22:36 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-b32f23c7-489d-44c9-a3b6-ff5bcecb95d8 req-0568c84f-44a2-4298-94e2-328ab94c90cc service nova] Acquiring lock "c16a352d-3f0c-4688-a890-81be1fee9f35-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:22:36 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-b32f23c7-489d-44c9-a3b6-ff5bcecb95d8 req-0568c84f-44a2-4298-94e2-328ab94c90cc service nova] Lock "c16a352d-3f0c-4688-a890-81be1fee9f35-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:22:36 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-b32f23c7-489d-44c9-a3b6-ff5bcecb95d8 req-0568c84f-44a2-4298-94e2-328ab94c90cc service nova] Lock "c16a352d-3f0c-4688-a890-81be1fee9f35-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:22:36 user nova-compute[70975]: DEBUG nova.compute.manager [req-b32f23c7-489d-44c9-a3b6-ff5bcecb95d8 req-0568c84f-44a2-4298-94e2-328ab94c90cc service nova] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] No waiting events found dispatching network-vif-plugged-159a1d71-92e3-4ac2-ad79-f530a49580e9 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:22:36 user nova-compute[70975]: WARNING nova.compute.manager [req-b32f23c7-489d-44c9-a3b6-ff5bcecb95d8 req-0568c84f-44a2-4298-94e2-328ab94c90cc service nova] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Received unexpected event network-vif-plugged-159a1d71-92e3-4ac2-ad79-f530a49580e9 for instance with vm_state deleted and task_state None. 
Apr 18 16:22:37 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:37 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:39 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:43 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Acquiring lock "a4febff2-74e8-47ef-820d-f407a4d22d9d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:22:43 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Lock "a4febff2-74e8-47ef-820d-f407a4d22d9d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:22:43 user nova-compute[70975]: DEBUG nova.compute.manager [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Starting instance... {{(pid=70975) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 18 16:22:43 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:22:43 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:22:43 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=70975) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 18 16:22:43 user nova-compute[70975]: INFO nova.compute.claims [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Claim successful on node user Apr 18 16:22:43 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:22:43 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:22:43 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.272s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:22:43 user nova-compute[70975]: DEBUG nova.compute.manager [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Start building networks asynchronously for instance. {{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 18 16:22:43 user nova-compute[70975]: DEBUG nova.compute.manager [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Allocating IP information in the background. {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 18 16:22:43 user nova-compute[70975]: DEBUG nova.network.neutron [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] allocate_for_instance() {{(pid=70975) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 18 16:22:43 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 18 16:22:43 user nova-compute[70975]: DEBUG nova.compute.manager [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Start building block device mappings for instance. {{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 18 16:22:43 user nova-compute[70975]: DEBUG nova.policy [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2963911de4f34d79816a9a1f9ad24a27', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5695adbb14ea4162bc40547b1509a1e4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70975) authorize /opt/stack/nova/nova/policy.py:203}} Apr 18 16:22:43 user nova-compute[70975]: DEBUG nova.compute.manager [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Start spawning the instance on the hypervisor. {{(pid=70975) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 18 16:22:43 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Creating instance directory {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 18 16:22:43 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Creating image(s) Apr 18 16:22:43 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Acquiring lock "/opt/stack/data/nova/instances/a4febff2-74e8-47ef-820d-f407a4d22d9d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:22:43 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Lock "/opt/stack/data/nova/instances/a4febff2-74e8-47ef-820d-f407a4d22d9d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:22:43 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Lock 
"/opt/stack/data/nova/instances/a4febff2-74e8-47ef-820d-f407a4d22d9d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:22:43 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:22:43 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.141s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:22:43 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Acquiring lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:22:43 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:22:43 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:22:44 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.140s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:22:44 user nova-compute[70975]: DEBUG 
oslo_concurrency.processutils [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/a4febff2-74e8-47ef-820d-f407a4d22d9d/disk 1073741824 {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:22:44 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/a4febff2-74e8-47ef-820d-f407a4d22d9d/disk 1073741824" returned: 0 in 0.055s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:22:44 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Lock "72bd97915ab7c08468b7f34ddcae11f3f23c8053" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.202s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:22:44 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:22:44 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.156s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:22:44 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Checking if we can resize image /opt/stack/data/nova/instances/a4febff2-74e8-47ef-820d-f407a4d22d9d/disk. 
size=1073741824 {{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 18 16:22:44 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a4febff2-74e8-47ef-820d-f407a4d22d9d/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:22:44 user nova-compute[70975]: DEBUG nova.network.neutron [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Successfully created port: 5e7e767e-18ff-4103-8ea8-ce2a0375d42e {{(pid=70975) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 18 16:22:44 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:44 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a4febff2-74e8-47ef-820d-f407a4d22d9d/disk --force-share --output=json" returned: 0 in 0.166s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:22:44 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Cannot resize image /opt/stack/data/nova/instances/a4febff2-74e8-47ef-820d-f407a4d22d9d/disk to a smaller size. 
{{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 18 16:22:44 user nova-compute[70975]: DEBUG nova.objects.instance [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Lazy-loading 'migration_context' on Instance uuid a4febff2-74e8-47ef-820d-f407a4d22d9d {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:22:44 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Created local disks {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 18 16:22:44 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Ensure instance console log exists: /opt/stack/data/nova/instances/a4febff2-74e8-47ef-820d-f407a4d22d9d/console.log {{(pid=70975) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 18 16:22:44 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:22:44 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:22:44 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:22:45 user nova-compute[70975]: DEBUG nova.network.neutron [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Successfully updated port: 5e7e767e-18ff-4103-8ea8-ce2a0375d42e {{(pid=70975) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 18 16:22:45 user nova-compute[70975]: DEBUG nova.compute.manager [req-1fd51e38-cf5b-4ae3-a560-52260b286367 req-917ffe90-f630-4728-a7bc-860bf6650e8f service nova] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Received event network-changed-5e7e767e-18ff-4103-8ea8-ce2a0375d42e {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:22:45 user nova-compute[70975]: DEBUG nova.compute.manager [req-1fd51e38-cf5b-4ae3-a560-52260b286367 req-917ffe90-f630-4728-a7bc-860bf6650e8f service nova] [instance: 
a4febff2-74e8-47ef-820d-f407a4d22d9d] Refreshing instance network info cache due to event network-changed-5e7e767e-18ff-4103-8ea8-ce2a0375d42e. {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 18 16:22:45 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-1fd51e38-cf5b-4ae3-a560-52260b286367 req-917ffe90-f630-4728-a7bc-860bf6650e8f service nova] Acquiring lock "refresh_cache-a4febff2-74e8-47ef-820d-f407a4d22d9d" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:22:45 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-1fd51e38-cf5b-4ae3-a560-52260b286367 req-917ffe90-f630-4728-a7bc-860bf6650e8f service nova] Acquired lock "refresh_cache-a4febff2-74e8-47ef-820d-f407a4d22d9d" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:22:45 user nova-compute[70975]: DEBUG nova.network.neutron [req-1fd51e38-cf5b-4ae3-a560-52260b286367 req-917ffe90-f630-4728-a7bc-860bf6650e8f service nova] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Refreshing network info cache for port 5e7e767e-18ff-4103-8ea8-ce2a0375d42e {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 18 16:22:45 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Acquiring lock "refresh_cache-a4febff2-74e8-47ef-820d-f407a4d22d9d" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:22:45 user nova-compute[70975]: DEBUG nova.network.neutron [req-1fd51e38-cf5b-4ae3-a560-52260b286367 req-917ffe90-f630-4728-a7bc-860bf6650e8f service nova] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Instance cache missing network info. 
{{(pid=70975) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 18 16:22:45 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:46 user nova-compute[70975]: DEBUG nova.network.neutron [req-1fd51e38-cf5b-4ae3-a560-52260b286367 req-917ffe90-f630-4728-a7bc-860bf6650e8f service nova] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Updating instance_info_cache with network_info: [] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:22:46 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-1fd51e38-cf5b-4ae3-a560-52260b286367 req-917ffe90-f630-4728-a7bc-860bf6650e8f service nova] Releasing lock "refresh_cache-a4febff2-74e8-47ef-820d-f407a4d22d9d" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:22:46 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Acquired lock "refresh_cache-a4febff2-74e8-47ef-820d-f407a4d22d9d" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:22:46 user nova-compute[70975]: DEBUG nova.network.neutron [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Building network info cache for instance {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 18 16:22:46 user nova-compute[70975]: DEBUG nova.network.neutron [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Instance cache missing network info. 
{{(pid=70975) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 18 16:22:46 user nova-compute[70975]: DEBUG nova.network.neutron [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Updating instance_info_cache with network_info: [{"id": "5e7e767e-18ff-4103-8ea8-ce2a0375d42e", "address": "fa:16:3e:32:2a:5e", "network": {"id": "236fa8aa-433b-4dfa-a787-f165c3389489", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1486162327-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5695adbb14ea4162bc40547b1509a1e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e7e767e-18", "ovs_interfaceid": "5e7e767e-18ff-4103-8ea8-ce2a0375d42e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:22:46 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Releasing lock "refresh_cache-a4febff2-74e8-47ef-820d-f407a4d22d9d" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:22:46 user nova-compute[70975]: DEBUG nova.compute.manager [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Instance network_info: |[{"id": "5e7e767e-18ff-4103-8ea8-ce2a0375d42e", "address": "fa:16:3e:32:2a:5e", "network": {"id": "236fa8aa-433b-4dfa-a787-f165c3389489", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1486162327-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5695adbb14ea4162bc40547b1509a1e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e7e767e-18", "ovs_interfaceid": "5e7e767e-18ff-4103-8ea8-ce2a0375d42e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 18 16:22:46 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Start _get_guest_xml 
network_info=[{"id": "5e7e767e-18ff-4103-8ea8-ce2a0375d42e", "address": "fa:16:3e:32:2a:5e", "network": {"id": "236fa8aa-433b-4dfa-a787-f165c3389489", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1486162327-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5695adbb14ea4162bc40547b1509a1e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e7e767e-18", "ovs_interfaceid": "5e7e767e-18ff-4103-8ea8-ce2a0375d42e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encrypted': False, 'device_type': 'disk', 'encryption_secret_uuid': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'b11a20de-f82a-4158-b53e-0a0c7a1552cb'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 18 16:22:46 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:22:46 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 18 16:22:46 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70975) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 18 16:22:46 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-18T16:11:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:10:22Z,direct_url=,disk_format='qcow2',id=b11a20de-f82a-4158-b53e-0a0c7a1552cb,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='da33a00a857643d4b924633c3a187f34',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:10:24Z,virtual_size=,visibility=), allow threads: True {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 18 16:22:46 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Flavor limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 18 16:22:46 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Image limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 18 16:22:46 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Flavor pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 18 16:22:46 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Image pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 18 16:22:46 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 18 16:22:46 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70975) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:564}} Apr 18 16:22:46 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 18 16:22:46 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Got 1 possible topologies {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 18 16:22:46 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 18 16:22:46 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 18 16:22:46 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:22:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1167960863',display_name='tempest-ServersNegativeTestJSON-server-1167960863',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-1167960863',id=22,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5695adbb14ea4162bc40547b1509a1e4',ramdisk_id='',reservation_id='r-lgt4t5zg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1696086909',owner_user_name='tempest-ServersNegativeTestJSON-1696086909-project-member'},tags=TagLi
st,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:22:44Z,user_data=None,user_id='2963911de4f34d79816a9a1f9ad24a27',uuid=a4febff2-74e8-47ef-820d-f407a4d22d9d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5e7e767e-18ff-4103-8ea8-ce2a0375d42e", "address": "fa:16:3e:32:2a:5e", "network": {"id": "236fa8aa-433b-4dfa-a787-f165c3389489", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1486162327-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5695adbb14ea4162bc40547b1509a1e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e7e767e-18", "ovs_interfaceid": "5e7e767e-18ff-4103-8ea8-ce2a0375d42e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70975) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 18 16:22:46 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Converting VIF {"id": "5e7e767e-18ff-4103-8ea8-ce2a0375d42e", "address": "fa:16:3e:32:2a:5e", "network": {"id": "236fa8aa-433b-4dfa-a787-f165c3389489", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1486162327-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5695adbb14ea4162bc40547b1509a1e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e7e767e-18", "ovs_interfaceid": "5e7e767e-18ff-4103-8ea8-ce2a0375d42e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:22:46 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:32:2a:5e,bridge_name='br-int',has_traffic_filtering=True,id=5e7e767e-18ff-4103-8ea8-ce2a0375d42e,network=Network(236fa8aa-433b-4dfa-a787-f165c3389489),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e7e767e-18') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:22:46 user nova-compute[70975]: DEBUG nova.objects.instance [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Lazy-loading 'pci_devices' on Instance uuid 
a4febff2-74e8-47ef-820d-f407a4d22d9d {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:22:46 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] End _get_guest_xml xml= [generated libvirt guest domain XML omitted: the XML element markup was stripped when this log was captured, leaving only bare values; recoverable fragments include instance name instance-00000016, uuid a4febff2-74e8-47ef-820d-f407a4d22d9d, display name tempest-ServersNegativeTestJSON-server-1167960863, creation time 2023-04-18 16:22:46, memory 131072 KiB, 1 vCPU, owner tempest-ServersNegativeTestJSON-1696086909 / tempest-ServersNegativeTestJSON-1696086909-project-member, SMBIOS strings OpenStack Foundation / OpenStack Nova / 0.0.0, os type hvm, CPU model Nehalem, and RNG backend /dev/urandom] {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}}
Apr 18 16:22:46 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:22:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1167960863',display_name='tempest-ServersNegativeTestJSON-server-1167960863',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-1167960863',id=22,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5695adbb14ea4162bc40547b1509a1e4',ramdisk_id='',reservation_id='r-lgt4t5zg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1696086909',owner_user_name='tempest-ServersNegativeTestJSON-1696086909-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:22:44Z,user_data=None,user_id='2963911de4f34d79816a9a1f9ad24a27',uuid=a4febff2-74e8-47ef-820d-f407a4d22d9d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5e7e767e-18ff-4103-8ea8-ce2a0375d42e", "address": "fa:16:3e:32:2a:5e", "network": {"id": "236fa8aa-433b-4dfa-a787-f165c3389489", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1486162327-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type":
"gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5695adbb14ea4162bc40547b1509a1e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e7e767e-18", "ovs_interfaceid": "5e7e767e-18ff-4103-8ea8-ce2a0375d42e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 18 16:22:46 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Converting VIF {"id": "5e7e767e-18ff-4103-8ea8-ce2a0375d42e", "address": "fa:16:3e:32:2a:5e", "network": {"id": "236fa8aa-433b-4dfa-a787-f165c3389489", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1486162327-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5695adbb14ea4162bc40547b1509a1e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e7e767e-18", "ovs_interfaceid": "5e7e767e-18ff-4103-8ea8-ce2a0375d42e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:22:46 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:32:2a:5e,bridge_name='br-int',has_traffic_filtering=True,id=5e7e767e-18ff-4103-8ea8-ce2a0375d42e,network=Network(236fa8aa-433b-4dfa-a787-f165c3389489),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e7e767e-18') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:22:46 user nova-compute[70975]: DEBUG os_vif [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:2a:5e,bridge_name='br-int',has_traffic_filtering=True,id=5e7e767e-18ff-4103-8ea8-ce2a0375d42e,network=Network(236fa8aa-433b-4dfa-a787-f165c3389489),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e7e767e-18') {{(pid=70975) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 18 16:22:46 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:46 user nova-compute[70975]: DEBUG 
ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:22:46 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 18 16:22:46 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:46 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5e7e767e-18, may_exist=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:22:46 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5e7e767e-18, col_values=(('external_ids', {'iface-id': '5e7e767e-18ff-4103-8ea8-ce2a0375d42e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:32:2a:5e', 'vm-uuid': 'a4febff2-74e8-47ef-820d-f407a4d22d9d'}),)) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:22:46 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:22:46 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:46 user nova-compute[70975]: INFO os_vif [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:2a:5e,bridge_name='br-int',has_traffic_filtering=True,id=5e7e767e-18ff-4103-8ea8-ce2a0375d42e,network=Network(236fa8aa-433b-4dfa-a787-f165c3389489),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e7e767e-18') Apr 18 16:22:46 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=70975) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 18 16:22:46 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] No VIF found with MAC fa:16:3e:32:2a:5e, not building metadata {{(pid=70975) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 18 16:22:48 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:48 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:48 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:48 user nova-compute[70975]: DEBUG nova.compute.manager [req-7fece3fe-2f52-43e7-8db1-83567a61bb08 req-5d24f548-1851-4024-9634-0a4b4d3168e0 service nova] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Received event network-vif-plugged-5e7e767e-18ff-4103-8ea8-ce2a0375d42e {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:22:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-7fece3fe-2f52-43e7-8db1-83567a61bb08 req-5d24f548-1851-4024-9634-0a4b4d3168e0 service nova] Acquiring lock "a4febff2-74e8-47ef-820d-f407a4d22d9d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:22:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-7fece3fe-2f52-43e7-8db1-83567a61bb08 req-5d24f548-1851-4024-9634-0a4b4d3168e0 service nova] Lock "a4febff2-74e8-47ef-820d-f407a4d22d9d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:22:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-7fece3fe-2f52-43e7-8db1-83567a61bb08 req-5d24f548-1851-4024-9634-0a4b4d3168e0 service nova] Lock "a4febff2-74e8-47ef-820d-f407a4d22d9d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:22:48 user nova-compute[70975]: DEBUG nova.compute.manager [req-7fece3fe-2f52-43e7-8db1-83567a61bb08 req-5d24f548-1851-4024-9634-0a4b4d3168e0 service nova] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] No waiting events found dispatching network-vif-plugged-5e7e767e-18ff-4103-8ea8-ce2a0375d42e {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:22:48 user nova-compute[70975]: WARNING nova.compute.manager [req-7fece3fe-2f52-43e7-8db1-83567a61bb08 req-5d24f548-1851-4024-9634-0a4b4d3168e0 service nova] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Received unexpected event network-vif-plugged-5e7e767e-18ff-4103-8ea8-ce2a0375d42e for instance with vm_state building and task_state spawning. 
Apr 18 16:22:48 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:49 user nova-compute[70975]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:22:49 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] VM Stopped (Lifecycle Event) Apr 18 16:22:49 user nova-compute[70975]: DEBUG nova.compute.manager [None req-2b2eba7e-cbb0-4fa0-872c-b84c6d57ff5d None None] [instance: c16a352d-3f0c-4688-a890-81be1fee9f35] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:22:50 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Resumed> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:22:50 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] VM Resumed (Lifecycle Event) Apr 18 16:22:50 user nova-compute[70975]: DEBUG nova.compute.manager [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Instance event wait completed in 0 seconds for {{(pid=70975) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 18 16:22:50 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Guest created on hypervisor {{(pid=70975) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 18 16:22:50 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Instance spawned successfully. 
Apr 18 16:22:50 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 18 16:22:50 user nova-compute[70975]: DEBUG nova.compute.manager [req-eaee9eb1-2933-4866-9755-40c0dd322de2 req-ff55e858-6ab0-4671-9c8b-cef276481cab service nova] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Received event network-vif-plugged-5e7e767e-18ff-4103-8ea8-ce2a0375d42e {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:22:50 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-eaee9eb1-2933-4866-9755-40c0dd322de2 req-ff55e858-6ab0-4671-9c8b-cef276481cab service nova] Acquiring lock "a4febff2-74e8-47ef-820d-f407a4d22d9d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:22:50 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-eaee9eb1-2933-4866-9755-40c0dd322de2 req-ff55e858-6ab0-4671-9c8b-cef276481cab service nova] Lock "a4febff2-74e8-47ef-820d-f407a4d22d9d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:22:50 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-eaee9eb1-2933-4866-9755-40c0dd322de2 req-ff55e858-6ab0-4671-9c8b-cef276481cab service nova] Lock "a4febff2-74e8-47ef-820d-f407a4d22d9d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:22:50 user nova-compute[70975]: DEBUG nova.compute.manager [req-eaee9eb1-2933-4866-9755-40c0dd322de2 req-ff55e858-6ab0-4671-9c8b-cef276481cab service nova] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] No waiting events found dispatching network-vif-plugged-5e7e767e-18ff-4103-8ea8-ce2a0375d42e {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:22:50 user nova-compute[70975]: WARNING nova.compute.manager [req-eaee9eb1-2933-4866-9755-40c0dd322de2 req-ff55e858-6ab0-4671-9c8b-cef276481cab service nova] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Received unexpected event network-vif-plugged-5e7e767e-18ff-4103-8ea8-ce2a0375d42e for instance with vm_state building and task_state spawning. 
Apr 18 16:22:50 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:22:50 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:22:50 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Found default for hw_cdrom_bus of ide {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:22:50 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Found default for hw_disk_bus of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:22:50 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Found default for hw_input_bus of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:22:50 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Found default for hw_pointer_model of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:22:50 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Found default for hw_video_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:22:50 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Found default for hw_vif_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:22:50 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 18 16:22:50 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Started> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:22:50 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] VM Started (Lifecycle Event) Apr 18 16:22:50 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:22:50 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:22:50 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] During sync_power_state the instance has a pending task (spawning). Skip. Apr 18 16:22:50 user nova-compute[70975]: INFO nova.compute.manager [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Took 6.70 seconds to spawn the instance on the hypervisor. Apr 18 16:22:50 user nova-compute[70975]: DEBUG nova.compute.manager [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:22:50 user nova-compute[70975]: INFO nova.compute.manager [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Took 7.33 seconds to build instance. 
Apr 18 16:22:50 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-bba57677-8b4a-45e0-b074-ac345bb3e2ee tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Lock "a4febff2-74e8-47ef-820d-f407a4d22d9d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.436s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:22:51 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:51 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:52 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:54 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:55 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:56 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:57 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:58 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:58 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-6c11fec6-beb0-4171-a168-e3af82f534b2 tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Acquiring lock "1b530349-680e-4def-86ef-29c340efa175" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:22:58 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-6c11fec6-beb0-4171-a168-e3af82f534b2 tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Lock "1b530349-680e-4def-86ef-29c340efa175" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:22:58 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-6c11fec6-beb0-4171-a168-e3af82f534b2 tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Acquiring lock "1b530349-680e-4def-86ef-29c340efa175-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:22:58 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-6c11fec6-beb0-4171-a168-e3af82f534b2 tempest-ServerActionsTestJSON-1239704997 
tempest-ServerActionsTestJSON-1239704997-project-member] Lock "1b530349-680e-4def-86ef-29c340efa175-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:22:58 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-6c11fec6-beb0-4171-a168-e3af82f534b2 tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Lock "1b530349-680e-4def-86ef-29c340efa175-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:22:58 user nova-compute[70975]: INFO nova.compute.manager [None req-6c11fec6-beb0-4171-a168-e3af82f534b2 tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] [instance: 1b530349-680e-4def-86ef-29c340efa175] Terminating instance Apr 18 16:22:58 user nova-compute[70975]: DEBUG nova.compute.manager [None req-6c11fec6-beb0-4171-a168-e3af82f534b2 tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] [instance: 1b530349-680e-4def-86ef-29c340efa175] Start destroying the instance on the hypervisor. {{(pid=70975) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 18 16:22:59 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:59 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:59 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:59 user nova-compute[70975]: DEBUG nova.compute.manager [req-c9d1a52e-8efd-47ee-86eb-3526a3c4cd0b req-cf5c5ebf-1f86-403a-bd06-4bf8d404e434 service nova] [instance: 1b530349-680e-4def-86ef-29c340efa175] Received event network-vif-unplugged-64d26c20-add4-4a63-bace-6a3678032692 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:22:59 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-c9d1a52e-8efd-47ee-86eb-3526a3c4cd0b req-cf5c5ebf-1f86-403a-bd06-4bf8d404e434 service nova] Acquiring lock "1b530349-680e-4def-86ef-29c340efa175-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:22:59 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-c9d1a52e-8efd-47ee-86eb-3526a3c4cd0b req-cf5c5ebf-1f86-403a-bd06-4bf8d404e434 service nova] Lock "1b530349-680e-4def-86ef-29c340efa175-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:22:59 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-c9d1a52e-8efd-47ee-86eb-3526a3c4cd0b req-cf5c5ebf-1f86-403a-bd06-4bf8d404e434 service nova] Lock "1b530349-680e-4def-86ef-29c340efa175-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) 
inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:22:59 user nova-compute[70975]: DEBUG nova.compute.manager [req-c9d1a52e-8efd-47ee-86eb-3526a3c4cd0b req-cf5c5ebf-1f86-403a-bd06-4bf8d404e434 service nova] [instance: 1b530349-680e-4def-86ef-29c340efa175] No waiting events found dispatching network-vif-unplugged-64d26c20-add4-4a63-bace-6a3678032692 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:22:59 user nova-compute[70975]: DEBUG nova.compute.manager [req-c9d1a52e-8efd-47ee-86eb-3526a3c4cd0b req-cf5c5ebf-1f86-403a-bd06-4bf8d404e434 service nova] [instance: 1b530349-680e-4def-86ef-29c340efa175] Received event network-vif-unplugged-64d26c20-add4-4a63-bace-6a3678032692 for instance with task_state deleting. {{(pid=70975) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 18 16:22:59 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: 1b530349-680e-4def-86ef-29c340efa175] Instance destroyed successfully. Apr 18 16:22:59 user nova-compute[70975]: DEBUG nova.objects.instance [None req-6c11fec6-beb0-4171-a168-e3af82f534b2 tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Lazy-loading 'resources' on Instance uuid 1b530349-680e-4def-86ef-29c340efa175 {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:22:59 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-6c11fec6-beb0-4171-a168-e3af82f534b2 tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:14:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1074846308',display_name='tempest-ServerActionsTestJSON-server-1074846308',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serveractionstestjson-server-1074846308',id=3,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAYfshyMt8uY2q2eUCQtkPkI3nGNlhhmmc9vp/6UdeXopca0J7dByvvp0JsnsKIVnxALXrFdF6MbHDsrQpV6fGcr4UECEAJuS6I1V5v6lY3+aDsuDcDzQvqgi06XGLFiPA==',key_name='tempest-keypair-228479226',keypairs=,launch_index=0,launched_at=2023-04-18T16:14:23Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='caa61b19cc4e4cd4bb7d41291c40ef1f',ramdisk_id='',reservation_id='r-1w99jwsl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerActionsTestJSON-1239704997',owner_user_name='tempest-ServerActionsTestJSON-1239704997-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-18T16:14:24Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='07b7b9d8fdcf42f29e83e755f4f27380',uuid=1b530349-680e-4def-86ef-29c340efa175,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "64d26c20-add4-4a63-bace-6a3678032692", "address": "fa:16:3e:33:ec:46", "network": {"id": "f5beaf4a-eeaf-454b-bde5-dd5e1f15e9dd", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-215585786-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "caa61b19cc4e4cd4bb7d41291c40ef1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap64d26c20-ad", "ovs_interfaceid": "64d26c20-add4-4a63-bace-6a3678032692", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 18 16:22:59 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-6c11fec6-beb0-4171-a168-e3af82f534b2 tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Converting VIF {"id": "64d26c20-add4-4a63-bace-6a3678032692", "address": "fa:16:3e:33:ec:46", "network": {"id": "f5beaf4a-eeaf-454b-bde5-dd5e1f15e9dd", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-215585786-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "caa61b19cc4e4cd4bb7d41291c40ef1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap64d26c20-ad", "ovs_interfaceid": "64d26c20-add4-4a63-bace-6a3678032692", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:22:59 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-6c11fec6-beb0-4171-a168-e3af82f534b2 tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:33:ec:46,bridge_name='br-int',has_traffic_filtering=True,id=64d26c20-add4-4a63-bace-6a3678032692,network=Network(f5beaf4a-eeaf-454b-bde5-dd5e1f15e9dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap64d26c20-ad') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:22:59 user nova-compute[70975]: DEBUG os_vif [None req-6c11fec6-beb0-4171-a168-e3af82f534b2 tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:33:ec:46,bridge_name='br-int',has_traffic_filtering=True,id=64d26c20-add4-4a63-bace-6a3678032692,network=Network(f5beaf4a-eeaf-454b-bde5-dd5e1f15e9dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap64d26c20-ad') {{(pid=70975) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 18 16:22:59 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:59 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap64d26c20-ad, bridge=br-int, if_exists=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:22:59 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:59 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:22:59 user nova-compute[70975]: INFO os_vif [None req-6c11fec6-beb0-4171-a168-e3af82f534b2 tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:33:ec:46,bridge_name='br-int',has_traffic_filtering=True,id=64d26c20-add4-4a63-bace-6a3678032692,network=Network(f5beaf4a-eeaf-454b-bde5-dd5e1f15e9dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap64d26c20-ad') Apr 18 16:22:59 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-6c11fec6-beb0-4171-a168-e3af82f534b2 tempest-ServerActionsTestJSON-1239704997 
tempest-ServerActionsTestJSON-1239704997-project-member] [instance: 1b530349-680e-4def-86ef-29c340efa175] Deleting instance files /opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175_del Apr 18 16:22:59 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-6c11fec6-beb0-4171-a168-e3af82f534b2 tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] [instance: 1b530349-680e-4def-86ef-29c340efa175] Deletion of /opt/stack/data/nova/instances/1b530349-680e-4def-86ef-29c340efa175_del complete Apr 18 16:22:59 user nova-compute[70975]: INFO nova.compute.manager [None req-6c11fec6-beb0-4171-a168-e3af82f534b2 tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] [instance: 1b530349-680e-4def-86ef-29c340efa175] Took 0.86 seconds to destroy the instance on the hypervisor. Apr 18 16:22:59 user nova-compute[70975]: DEBUG oslo.service.loopingcall [None req-6c11fec6-beb0-4171-a168-e3af82f534b2 tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=70975) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 18 16:22:59 user nova-compute[70975]: DEBUG nova.compute.manager [-] [instance: 1b530349-680e-4def-86ef-29c340efa175] Deallocating network for instance {{(pid=70975) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 18 16:22:59 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: 1b530349-680e-4def-86ef-29c340efa175] deallocate_for_instance() {{(pid=70975) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 18 16:23:00 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: 1b530349-680e-4def-86ef-29c340efa175] Updating instance_info_cache with network_info: [] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:23:00 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: 1b530349-680e-4def-86ef-29c340efa175] Took 0.78 seconds to deallocate network for instance. 
Apr 18 16:23:00 user nova-compute[70975]: DEBUG nova.compute.manager [req-e38f87cb-1ece-4555-a3bb-55cec492a488 req-bd40273d-3b15-4d1b-8a10-97b33d6fdaf1 service nova] [instance: 1b530349-680e-4def-86ef-29c340efa175] Received event network-vif-deleted-64d26c20-add4-4a63-bace-6a3678032692 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:23:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-6c11fec6-beb0-4171-a168-e3af82f534b2 tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:23:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-6c11fec6-beb0-4171-a168-e3af82f534b2 tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.003s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:23:00 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-6c11fec6-beb0-4171-a168-e3af82f534b2 tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:23:00 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-6c11fec6-beb0-4171-a168-e3af82f534b2 tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:23:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-6c11fec6-beb0-4171-a168-e3af82f534b2 tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.228s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:23:00 user nova-compute[70975]: INFO nova.scheduler.client.report [None req-6c11fec6-beb0-4171-a168-e3af82f534b2 tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Deleted allocations for instance 1b530349-680e-4def-86ef-29c340efa175 Apr 18 16:23:01 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-6c11fec6-beb0-4171-a168-e3af82f534b2 tempest-ServerActionsTestJSON-1239704997 tempest-ServerActionsTestJSON-1239704997-project-member] Lock "1b530349-680e-4def-86ef-29c340efa175" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.086s {{(pid=70975) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:23:01 user nova-compute[70975]: DEBUG nova.compute.manager [req-603e1586-d378-415d-b4cc-a39160b6ad4e req-5932fdb6-ac7d-46f1-af22-b9732ae9df9e service nova] [instance: 1b530349-680e-4def-86ef-29c340efa175] Received event network-vif-plugged-64d26c20-add4-4a63-bace-6a3678032692 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:23:01 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-603e1586-d378-415d-b4cc-a39160b6ad4e req-5932fdb6-ac7d-46f1-af22-b9732ae9df9e service nova] Acquiring lock "1b530349-680e-4def-86ef-29c340efa175-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:23:01 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-603e1586-d378-415d-b4cc-a39160b6ad4e req-5932fdb6-ac7d-46f1-af22-b9732ae9df9e service nova] Lock "1b530349-680e-4def-86ef-29c340efa175-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:23:01 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-603e1586-d378-415d-b4cc-a39160b6ad4e req-5932fdb6-ac7d-46f1-af22-b9732ae9df9e service nova] Lock "1b530349-680e-4def-86ef-29c340efa175-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:23:01 user nova-compute[70975]: DEBUG nova.compute.manager [req-603e1586-d378-415d-b4cc-a39160b6ad4e req-5932fdb6-ac7d-46f1-af22-b9732ae9df9e service nova] [instance: 1b530349-680e-4def-86ef-29c340efa175] No waiting events found dispatching network-vif-plugged-64d26c20-add4-4a63-bace-6a3678032692 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:23:01 user nova-compute[70975]: WARNING nova.compute.manager [req-603e1586-d378-415d-b4cc-a39160b6ad4e req-5932fdb6-ac7d-46f1-af22-b9732ae9df9e service nova] [instance: 1b530349-680e-4def-86ef-29c340efa175] Received unexpected event network-vif-plugged-64d26c20-add4-4a63-bace-6a3678032692 for instance with vm_state deleted and task_state None. Apr 18 16:23:01 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:23:02 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:23:02 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:23:02 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=70975) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 18 16:23:02 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:23:02 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:23:03 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:23:04 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:23:04 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:23:04 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Starting heal instance info cache {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 18 16:23:04 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Rebuilding the list of instances to heal {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 18 16:23:04 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "refresh_cache-6528f05a-9f05-4f35-b991-687e4f47029e" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:23:04 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquired lock "refresh_cache-6528f05a-9f05-4f35-b991-687e4f47029e" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:23:04 user nova-compute[70975]: DEBUG nova.network.neutron [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Forcefully refreshing network info cache for instance {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 18 16:23:04 user nova-compute[70975]: DEBUG nova.objects.instance [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lazy-loading 'info_cache' on Instance uuid 6528f05a-9f05-4f35-b991-687e4f47029e {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:23:04 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:23:05 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:23:05 user nova-compute[70975]: DEBUG nova.network.neutron [None req-5374b104-62e1-472c-925f-f4defab71e2c None 
None] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Updating instance_info_cache with network_info: [{"id": "08164ae1-ace4-4d80-ad79-1741eacfa16e", "address": "fa:16:3e:28:00:5b", "network": {"id": "923d10dc-c67e-4426-9c6e-856e903e2446", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-131274818-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f516f5ec45ca4508841c77f79e8c038b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap08164ae1-ac", "ovs_interfaceid": "08164ae1-ace4-4d80-ad79-1741eacfa16e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:23:05 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Releasing lock "refresh_cache-6528f05a-9f05-4f35-b991-687e4f47029e" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:23:05 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Updated the network info_cache for instance {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 18 16:23:05 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:23:05 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:23:05 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager.update_available_resource {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:23:05 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:23:05 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:23:05 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] 
Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:23:05 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Auditing locally available compute resources for user (node: user) {{(pid=70975) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 18 16:23:05 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:23:05 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk --force-share --output=json" returned: 0 in 0.151s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:23:05 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:23:05 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk --force-share --output=json" returned: 0 in 0.146s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:23:05 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6aece7dd-d545-4e26-9cb7-30ee0b01ebb2/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:23:06 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6aece7dd-d545-4e26-9cb7-30ee0b01ebb2/disk --force-share --output=json" returned: 0 in 0.153s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:23:06 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 
-m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6aece7dd-d545-4e26-9cb7-30ee0b01ebb2/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:23:06 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6aece7dd-d545-4e26-9cb7-30ee0b01ebb2/disk --force-share --output=json" returned: 0 in 0.144s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:23:06 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a4febff2-74e8-47ef-820d-f407a4d22d9d/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:23:06 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a4febff2-74e8-47ef-820d-f407a4d22d9d/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:23:06 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a4febff2-74e8-47ef-820d-f407a4d22d9d/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:23:06 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a4febff2-74e8-47ef-820d-f407a4d22d9d/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:23:06 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:23:06 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk --force-share --output=json" returned: 0 in 0.148s {{(pid=70975) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:23:06 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:23:06 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:23:07 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:23:07 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:23:07 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Hypervisor/Node resource view: name=user free_ram=8587MB free_disk=26.553882598876953GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": 
"0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": 
"label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70975) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 18 16:23:07 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:23:07 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:23:07 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 6528f05a-9f05-4f35-b991-687e4f47029e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:23:07 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 66df9389-d007-4737-8bb1-55bcb5f227ff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:23:07 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:23:07 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance a4febff2-74e8-47ef-820d-f407a4d22d9d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:23:07 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Total usable vcpus: 12, total allocated vcpus: 4 {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 18 16:23:07 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Final resource view: name=user phys_ram=16023MB used_ram=1024MB phys_disk=40GB used_disk=4GB total_vcpus=12 used_vcpus=4 pci_stats=[] {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 18 16:23:07 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:23:07 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:23:07 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:23:07 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Compute_service record updated for user:user {{(pid=70975) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 18 16:23:07 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils 
[None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.328s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:23:09 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:23:10 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:23:12 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:23:13 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:23:14 user nova-compute[70975]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:23:14 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: 1b530349-680e-4def-86ef-29c340efa175] VM Stopped (Lifecycle Event) Apr 18 16:23:14 user nova-compute[70975]: DEBUG nova.compute.manager [None req-a4307fb7-9401-4f88-b151-3f4ff81fb18f None None] [instance: 1b530349-680e-4def-86ef-29c340efa175] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:23:14 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:23:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:23:19 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:23:22 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:23:24 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:23:29 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:23:34 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:23:37 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:23:37 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:23:39 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:23:41 user nova-compute[70975]: DEBUG nova.compute.manager [None 
req-4a4db533-5453-40dd-8695-8875503b31a2 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:23:41 user nova-compute[70975]: INFO nova.compute.manager [None req-4a4db533-5453-40dd-8695-8875503b31a2 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] instance snapshotting Apr 18 16:23:41 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-4a4db533-5453-40dd-8695-8875503b31a2 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Beginning live snapshot process Apr 18 16:23:42 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-4a4db533-5453-40dd-8695-8875503b31a2 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6aece7dd-d545-4e26-9cb7-30ee0b01ebb2/disk --force-share --output=json -f qcow2 {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:23:42 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-4a4db533-5453-40dd-8695-8875503b31a2 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6aece7dd-d545-4e26-9cb7-30ee0b01ebb2/disk --force-share --output=json -f qcow2" returned: 0 in 0.133s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:23:42 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-4a4db533-5453-40dd-8695-8875503b31a2 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6aece7dd-d545-4e26-9cb7-30ee0b01ebb2/disk --force-share --output=json -f qcow2 {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:23:42 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-4a4db533-5453-40dd-8695-8875503b31a2 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6aece7dd-d545-4e26-9cb7-30ee0b01ebb2/disk --force-share --output=json -f qcow2" returned: 0 in 0.127s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:23:42 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-4a4db533-5453-40dd-8695-8875503b31a2 tempest-ServerBootFromVolumeStableRescueTest-2021464272 
tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:23:42 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-4a4db533-5453-40dd-8695-8875503b31a2 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053 --force-share --output=json" returned: 0 in 0.143s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:23:42 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-4a4db533-5453-40dd-8695-8875503b31a2 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmps3rkrcm0/7752659c23ac42e29bfa75ecec7e4206.delta 1073741824 {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:23:42 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-4a4db533-5453-40dd-8695-8875503b31a2 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/72bd97915ab7c08468b7f34ddcae11f3f23c8053,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmps3rkrcm0/7752659c23ac42e29bfa75ecec7e4206.delta 1073741824" returned: 0 in 0.058s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:23:42 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-4a4db533-5453-40dd-8695-8875503b31a2 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Quiescing instance not available: QEMU guest agent is not enabled. 
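[Editor's note] The resource audit and snapshot preparation above repeatedly shell out to "qemu-img info" wrapped in oslo_concurrency.prlimit, which caps the child process's address space (--as=1073741824) and CPU time (--cpu=30) so a pathological image cannot exhaust the compute host. The following is a rough sketch, not Nova's own code, showing how the logged invocation could be reproduced and its JSON output parsed; DISK_PATH is a hypothetical placeholder and the interpreter path simply mirrors the one recorded in the log.

    # Rough illustration only (not Nova source): reproduce the logged
    # prlimit-wrapped "qemu-img info" call and parse its JSON output.
    import json
    import subprocess

    DISK_PATH = "/tmp/example-disk.qcow2"  # hypothetical placeholder path

    cmd = [
        "/usr/bin/python3.10", "-m", "oslo_concurrency.prlimit",
        "--as=1073741824",   # cap address space at 1 GiB, as in the log
        "--cpu=30",          # cap CPU time at 30 seconds, as in the log
        "--", "env", "LC_ALL=C", "LANG=C",
        "qemu-img", "info", DISK_PATH, "--force-share", "--output=json",
    ]
    result = subprocess.run(cmd, check=True, capture_output=True, text=True)
    info = json.loads(result.stdout)
    print(info.get("format"), info.get("virtual-size"))

The --force-share flag lets the probe run against a disk that is still attached to a running guest, which is why the periodic resource audit can inspect every instance disk without pausing anything.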
Apr 18 16:23:43 user nova-compute[70975]: DEBUG nova.virt.libvirt.guest [None req-4a4db533-5453-40dd-8695-8875503b31a2 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] COPY block job progress, current cursor: 0 final cursor: 43778048 {{(pid=70975) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 18 16:23:43 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:23:43 user nova-compute[70975]: DEBUG nova.virt.libvirt.guest [None req-4a4db533-5453-40dd-8695-8875503b31a2 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] COPY block job progress, current cursor: 43778048 final cursor: 43778048 {{(pid=70975) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 18 16:23:43 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-4a4db533-5453-40dd-8695-8875503b31a2 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Skipping quiescing instance: QEMU guest agent is not enabled. Apr 18 16:23:43 user nova-compute[70975]: DEBUG nova.privsep.utils [None req-4a4db533-5453-40dd-8695-8875503b31a2 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=70975) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 18 16:23:43 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-4a4db533-5453-40dd-8695-8875503b31a2 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmps3rkrcm0/7752659c23ac42e29bfa75ecec7e4206.delta /opt/stack/data/nova/instances/snapshots/tmps3rkrcm0/7752659c23ac42e29bfa75ecec7e4206 {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:23:44 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-4a4db533-5453-40dd-8695-8875503b31a2 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmps3rkrcm0/7752659c23ac42e29bfa75ecec7e4206.delta /opt/stack/data/nova/instances/snapshots/tmps3rkrcm0/7752659c23ac42e29bfa75ecec7e4206" returned: 0 in 0.421s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:23:44 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-4a4db533-5453-40dd-8695-8875503b31a2 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Snapshot extracted, beginning image upload Apr 18 16:23:44 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:23:45 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 
{{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:23:46 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-4a4db533-5453-40dd-8695-8875503b31a2 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Snapshot image upload complete Apr 18 16:23:46 user nova-compute[70975]: INFO nova.compute.manager [None req-4a4db533-5453-40dd-8695-8875503b31a2 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Took 4.81 seconds to snapshot the instance on the hypervisor. Apr 18 16:23:47 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:23:49 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:23:49 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:23:52 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:23:53 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:23:54 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:23:54 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:23:57 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:23:58 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Acquiring lock "776d1402-3e8a-407d-a20d-db46c1a21b23" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:23:58 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lock "776d1402-3e8a-407d-a20d-db46c1a21b23" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:23:58 user nova-compute[70975]: DEBUG nova.compute.manager [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Starting instance... 
{{(pid=70975) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 18 16:23:58 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:23:58 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:23:58 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=70975) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 18 16:23:58 user nova-compute[70975]: INFO nova.compute.claims [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Claim successful on node user Apr 18 16:23:58 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:23:58 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:23:58 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.287s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:23:58 user nova-compute[70975]: DEBUG nova.compute.manager [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Start building networks asynchronously for instance. 
{{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 18 16:23:58 user nova-compute[70975]: DEBUG nova.compute.manager [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Allocating IP information in the background. {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 18 16:23:58 user nova-compute[70975]: DEBUG nova.network.neutron [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] allocate_for_instance() {{(pid=70975) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 18 16:23:58 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 18 16:23:58 user nova-compute[70975]: DEBUG nova.compute.manager [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Start building block device mappings for instance. {{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 18 16:23:58 user nova-compute[70975]: DEBUG nova.policy [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'add4c9d906ba49a590f203e0aa98ab64', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '00d4b993c13b46ea8f80b0caed60a373', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70975) authorize /opt/stack/nova/nova/policy.py:203}} Apr 18 16:23:58 user nova-compute[70975]: DEBUG nova.compute.manager [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Start spawning the instance on the hypervisor. 
{{(pid=70975) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 18 16:23:58 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Creating instance directory {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 18 16:23:58 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Creating image(s) Apr 18 16:23:58 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Acquiring lock "/opt/stack/data/nova/instances/776d1402-3e8a-407d-a20d-db46c1a21b23/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:23:58 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lock "/opt/stack/data/nova/instances/776d1402-3e8a-407d-a20d-db46c1a21b23/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:23:58 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lock "/opt/stack/data/nova/instances/776d1402-3e8a-407d-a20d-db46c1a21b23/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:23:58 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Acquiring lock "a9675fe7436dfa5332b88f674df3481caa144715" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:23:59 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lock "a9675fe7436dfa5332b88f674df3481caa144715" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:23:59 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img 
info /opt/stack/data/nova/instances/_base/a9675fe7436dfa5332b88f674df3481caa144715.part --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:23:59 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/a9675fe7436dfa5332b88f674df3481caa144715.part --force-share --output=json" returned: 0 in 0.149s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:23:59 user nova-compute[70975]: DEBUG nova.virt.images [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] 96f134fc-75ab-496a-9864-a702b8f45c60 was qcow2, converting to raw {{(pid=70975) fetch_to_raw /opt/stack/nova/nova/virt/images.py:165}} Apr 18 16:23:59 user nova-compute[70975]: DEBUG nova.privsep.utils [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=70975) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 18 16:23:59 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/a9675fe7436dfa5332b88f674df3481caa144715.part /opt/stack/data/nova/instances/_base/a9675fe7436dfa5332b88f674df3481caa144715.converted {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:23:59 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/a9675fe7436dfa5332b88f674df3481caa144715.part /opt/stack/data/nova/instances/_base/a9675fe7436dfa5332b88f674df3481caa144715.converted" returned: 0 in 0.117s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:23:59 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/a9675fe7436dfa5332b88f674df3481caa144715.converted --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:23:59 user nova-compute[70975]: DEBUG nova.network.neutron [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Successfully created port: 61df4fc4-1d98-4d7b-b74e-280c61eac6ee {{(pid=70975) 
_create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 18 16:23:59 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:23:59 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/a9675fe7436dfa5332b88f674df3481caa144715.converted --force-share --output=json" returned: 0 in 0.138s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:23:59 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lock "a9675fe7436dfa5332b88f674df3481caa144715" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.759s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:23:59 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/a9675fe7436dfa5332b88f674df3481caa144715 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:23:59 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/a9675fe7436dfa5332b88f674df3481caa144715 --force-share --output=json" returned: 0 in 0.129s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:23:59 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Acquiring lock "a9675fe7436dfa5332b88f674df3481caa144715" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:23:59 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lock "a9675fe7436dfa5332b88f674df3481caa144715" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:23:59 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 
tempest-TestMinimumBasicScenario-386776323-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/a9675fe7436dfa5332b88f674df3481caa144715 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:24:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/a9675fe7436dfa5332b88f674df3481caa144715 --force-share --output=json" returned: 0 in 0.135s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:24:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/a9675fe7436dfa5332b88f674df3481caa144715,backing_fmt=raw /opt/stack/data/nova/instances/776d1402-3e8a-407d-a20d-db46c1a21b23/disk 1073741824 {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:24:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/a9675fe7436dfa5332b88f674df3481caa144715,backing_fmt=raw /opt/stack/data/nova/instances/776d1402-3e8a-407d-a20d-db46c1a21b23/disk 1073741824" returned: 0 in 0.048s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:24:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lock "a9675fe7436dfa5332b88f674df3481caa144715" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.191s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:24:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/a9675fe7436dfa5332b88f674df3481caa144715 --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:24:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/_base/a9675fe7436dfa5332b88f674df3481caa144715 --force-share --output=json" returned: 0 in 0.136s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:24:00 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Checking if we can resize image /opt/stack/data/nova/instances/776d1402-3e8a-407d-a20d-db46c1a21b23/disk. size=1073741824 {{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 18 16:24:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/776d1402-3e8a-407d-a20d-db46c1a21b23/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:24:00 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/776d1402-3e8a-407d-a20d-db46c1a21b23/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:24:00 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Cannot resize image /opt/stack/data/nova/instances/776d1402-3e8a-407d-a20d-db46c1a21b23/disk to a smaller size. 
{{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 18 16:24:00 user nova-compute[70975]: DEBUG nova.objects.instance [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lazy-loading 'migration_context' on Instance uuid 776d1402-3e8a-407d-a20d-db46c1a21b23 {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:24:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Created local disks {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 18 16:24:00 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Ensure instance console log exists: /opt/stack/data/nova/instances/776d1402-3e8a-407d-a20d-db46c1a21b23/console.log {{(pid=70975) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 18 16:24:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:24:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:24:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:24:00 user nova-compute[70975]: DEBUG nova.network.neutron [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Successfully updated port: 61df4fc4-1d98-4d7b-b74e-280c61eac6ee {{(pid=70975) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 18 16:24:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Acquiring lock "refresh_cache-776d1402-3e8a-407d-a20d-db46c1a21b23" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:24:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 
tempest-TestMinimumBasicScenario-386776323-project-member] Acquired lock "refresh_cache-776d1402-3e8a-407d-a20d-db46c1a21b23" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:24:00 user nova-compute[70975]: DEBUG nova.network.neutron [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Building network info cache for instance {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 18 16:24:00 user nova-compute[70975]: DEBUG nova.compute.manager [req-277ad55d-0d5f-4a29-93c1-cd218a58ad67 req-d0ea3c0a-9602-4101-a805-24bf7649a6e6 service nova] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Received event network-changed-61df4fc4-1d98-4d7b-b74e-280c61eac6ee {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:24:00 user nova-compute[70975]: DEBUG nova.compute.manager [req-277ad55d-0d5f-4a29-93c1-cd218a58ad67 req-d0ea3c0a-9602-4101-a805-24bf7649a6e6 service nova] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Refreshing instance network info cache due to event network-changed-61df4fc4-1d98-4d7b-b74e-280c61eac6ee. {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 18 16:24:00 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-277ad55d-0d5f-4a29-93c1-cd218a58ad67 req-d0ea3c0a-9602-4101-a805-24bf7649a6e6 service nova] Acquiring lock "refresh_cache-776d1402-3e8a-407d-a20d-db46c1a21b23" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:24:00 user nova-compute[70975]: DEBUG nova.network.neutron [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Instance cache missing network info. 
{{(pid=70975) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 18 16:24:01 user nova-compute[70975]: DEBUG nova.network.neutron [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Updating instance_info_cache with network_info: [{"id": "61df4fc4-1d98-4d7b-b74e-280c61eac6ee", "address": "fa:16:3e:87:40:bf", "network": {"id": "1cb85bc6-49f1-46be-91c8-d814b48c2a99", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1818136803-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "00d4b993c13b46ea8f80b0caed60a373", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap61df4fc4-1d", "ovs_interfaceid": "61df4fc4-1d98-4d7b-b74e-280c61eac6ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:24:01 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Releasing lock "refresh_cache-776d1402-3e8a-407d-a20d-db46c1a21b23" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:24:01 user nova-compute[70975]: DEBUG nova.compute.manager [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Instance network_info: |[{"id": "61df4fc4-1d98-4d7b-b74e-280c61eac6ee", "address": "fa:16:3e:87:40:bf", "network": {"id": "1cb85bc6-49f1-46be-91c8-d814b48c2a99", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1818136803-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "00d4b993c13b46ea8f80b0caed60a373", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap61df4fc4-1d", "ovs_interfaceid": "61df4fc4-1d98-4d7b-b74e-280c61eac6ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 18 16:24:01 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-277ad55d-0d5f-4a29-93c1-cd218a58ad67 req-d0ea3c0a-9602-4101-a805-24bf7649a6e6 service nova] Acquired lock "refresh_cache-776d1402-3e8a-407d-a20d-db46c1a21b23" {{(pid=70975) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:24:01 user nova-compute[70975]: DEBUG nova.network.neutron [req-277ad55d-0d5f-4a29-93c1-cd218a58ad67 req-d0ea3c0a-9602-4101-a805-24bf7649a6e6 service nova] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Refreshing network info cache for port 61df4fc4-1d98-4d7b-b74e-280c61eac6ee {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 18 16:24:01 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Start _get_guest_xml network_info=[{"id": "61df4fc4-1d98-4d7b-b74e-280c61eac6ee", "address": "fa:16:3e:87:40:bf", "network": {"id": "1cb85bc6-49f1-46be-91c8-d814b48c2a99", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1818136803-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "00d4b993c13b46ea8f80b0caed60a373", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap61df4fc4-1d", "ovs_interfaceid": "61df4fc4-1d98-4d7b-b74e-280c61eac6ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:23:56Z,direct_url=,disk_format='qcow2',id=96f134fc-75ab-496a-9864-a702b8f45c60,min_disk=0,min_ram=0,name='tempest-scenario-img--2080285726',owner='00d4b993c13b46ea8f80b0caed60a373',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:23:57Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encrypted': False, 'device_type': 'disk', 'encryption_secret_uuid': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': '96f134fc-75ab-496a-9864-a702b8f45c60'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 18 16:24:01 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:24:01 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 18 16:24:01 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70975) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 18 16:24:01 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-18T16:11:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:23:56Z,direct_url=,disk_format='qcow2',id=96f134fc-75ab-496a-9864-a702b8f45c60,min_disk=0,min_ram=0,name='tempest-scenario-img--2080285726',owner='00d4b993c13b46ea8f80b0caed60a373',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:23:57Z,virtual_size=,visibility=), allow threads: True {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 18 16:24:01 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Flavor limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 18 16:24:01 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Image limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 18 16:24:01 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Flavor pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 18 16:24:01 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Image pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 18 16:24:01 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 18 16:24:01 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum 
VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 18 16:24:01 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 18 16:24:01 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Got 1 possible topologies {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 18 16:24:01 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 18 16:24:01 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 18 16:24:01 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:23:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-1809716632',display_name='tempest-TestMinimumBasicScenario-server-1809716632',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-1809716632',id=23,image_ref='96f134fc-75ab-496a-9864-a702b8f45c60',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEaEdQSpuucmjUI7CgJV9d16I0XBov7QClewDi9SugBZN+QXiQWDSnpN5XqyApn5LKFQekmnDneoXwFp0uoqimvSuH7yO+qi9pw7ukWtt3Q/33VBcmVZNDF9YXa9rca7pw==',key_name='tempest-TestMinimumBasicScenario-1759395561',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='00d4b993c13b46ea8f80b0caed60a373',ramdisk_id='',reservation_id='r-kut7p5u6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='96f134fc-75ab-496a-9864-a702b8f45c60',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-386776323',owner_user_name='tempest-TestMinimumBasicScenario-386776323-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:23:59Z,user_data=None,user_id='add4c9d906ba49a590f203e0aa98ab64',uuid=776d1402-3e8a-407d-a20d-db46c1a21b23,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "61df4fc4-1d98-4d7b-b74e-280c61eac6ee", "address": "fa:16:3e:87:40:bf", "network": {"id": "1cb85bc6-49f1-46be-91c8-d814b48c2a99", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1818136803-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "00d4b993c13b46ea8f80b0caed60a373", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap61df4fc4-1d", "ovs_interfaceid": "61df4fc4-1d98-4d7b-b74e-280c61eac6ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70975) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 18 16:24:01 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Converting VIF {"id": "61df4fc4-1d98-4d7b-b74e-280c61eac6ee", "address": "fa:16:3e:87:40:bf", "network": {"id": "1cb85bc6-49f1-46be-91c8-d814b48c2a99", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1818136803-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "00d4b993c13b46ea8f80b0caed60a373", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap61df4fc4-1d", "ovs_interfaceid": 
"61df4fc4-1d98-4d7b-b74e-280c61eac6ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:24:01 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:87:40:bf,bridge_name='br-int',has_traffic_filtering=True,id=61df4fc4-1d98-4d7b-b74e-280c61eac6ee,network=Network(1cb85bc6-49f1-46be-91c8-d814b48c2a99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61df4fc4-1d') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:24:01 user nova-compute[70975]: DEBUG nova.objects.instance [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lazy-loading 'pci_devices' on Instance uuid 776d1402-3e8a-407d-a20d-db46c1a21b23 {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:24:01 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] End _get_guest_xml xml= Apr 18 16:24:01 user nova-compute[70975]: 776d1402-3e8a-407d-a20d-db46c1a21b23 Apr 18 16:24:01 user nova-compute[70975]: instance-00000017 Apr 18 16:24:01 user nova-compute[70975]: 131072 Apr 18 16:24:01 user nova-compute[70975]: 1 Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: tempest-TestMinimumBasicScenario-server-1809716632 Apr 18 16:24:01 user nova-compute[70975]: 2023-04-18 16:24:01 Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: 128 Apr 18 16:24:01 user nova-compute[70975]: 1 Apr 18 16:24:01 user nova-compute[70975]: 0 Apr 18 16:24:01 user nova-compute[70975]: 0 Apr 18 16:24:01 user nova-compute[70975]: 1 Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: tempest-TestMinimumBasicScenario-386776323-project-member Apr 18 16:24:01 user nova-compute[70975]: tempest-TestMinimumBasicScenario-386776323 Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: OpenStack Foundation Apr 18 16:24:01 user nova-compute[70975]: OpenStack Nova Apr 18 16:24:01 user nova-compute[70975]: 0.0.0 Apr 18 16:24:01 user nova-compute[70975]: 776d1402-3e8a-407d-a20d-db46c1a21b23 Apr 18 16:24:01 user nova-compute[70975]: 776d1402-3e8a-407d-a20d-db46c1a21b23 Apr 18 16:24:01 user nova-compute[70975]: Virtual Machine Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 
18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: hvm Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Nehalem Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: /dev/urandom Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: Apr 18 16:24:01 user nova-compute[70975]: {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 18 16:24:01 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:23:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-1809716632',display_name='tempest-TestMinimumBasicScenario-server-1809716632',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-1809716632',id=23,image_ref='96f134fc-75ab-496a-9864-a702b8f45c60',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEaEdQSpuucmjUI7CgJV9d16I0XBov7QClewDi9SugBZN+QXiQWDSnpN5XqyApn5LKFQekmnDneoXwFp0uoqimvSuH7yO+qi9pw7ukWtt3Q/33VBcmVZNDF9YXa9rca7pw==',key_name='tempest-TestMinimumBasicScenario-1759395561',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='00d4b993c13b46ea8f80b0caed60a373',ramdisk_id='',reservation_id='r-kut7p5u6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='96f134fc-75ab-496a-9864-a702b8f45c60',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-386776323',owner_user_name='tempest-TestMinimumBasicScenario-386776323-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:23:59Z,user_data=None,user_id='add4c9d906ba49a590f203e0aa98ab64',uuid=776d1402-3e8a-407d-a20d-db46c1a21b23,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "61df4fc4-1d98-4d7b-b74e-280c61eac6ee", "address": "fa:16:3e:87:40:bf", "network": {"id": "1cb85bc6-49f1-46be-91c8-d814b48c2a99", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1818136803-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "00d4b993c13b46ea8f80b0caed60a373", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap61df4fc4-1d", "ovs_interfaceid": "61df4fc4-1d98-4d7b-b74e-280c61eac6ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 18 16:24:01 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Converting VIF {"id": "61df4fc4-1d98-4d7b-b74e-280c61eac6ee", "address": "fa:16:3e:87:40:bf", "network": {"id": "1cb85bc6-49f1-46be-91c8-d814b48c2a99", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1818136803-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "00d4b993c13b46ea8f80b0caed60a373", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap61df4fc4-1d", "ovs_interfaceid": 
"61df4fc4-1d98-4d7b-b74e-280c61eac6ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:24:01 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:87:40:bf,bridge_name='br-int',has_traffic_filtering=True,id=61df4fc4-1d98-4d7b-b74e-280c61eac6ee,network=Network(1cb85bc6-49f1-46be-91c8-d814b48c2a99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61df4fc4-1d') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:24:01 user nova-compute[70975]: DEBUG os_vif [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:40:bf,bridge_name='br-int',has_traffic_filtering=True,id=61df4fc4-1d98-4d7b-b74e-280c61eac6ee,network=Network(1cb85bc6-49f1-46be-91c8-d814b48c2a99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61df4fc4-1d') {{(pid=70975) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 18 16:24:01 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:01 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:24:01 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 18 16:24:01 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:01 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap61df4fc4-1d, may_exist=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:24:01 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap61df4fc4-1d, col_values=(('external_ids', {'iface-id': '61df4fc4-1d98-4d7b-b74e-280c61eac6ee', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:87:40:bf', 'vm-uuid': '776d1402-3e8a-407d-a20d-db46c1a21b23'}),)) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:24:01 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:01 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:24:01 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:01 user nova-compute[70975]: INFO os_vif [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:40:bf,bridge_name='br-int',has_traffic_filtering=True,id=61df4fc4-1d98-4d7b-b74e-280c61eac6ee,network=Network(1cb85bc6-49f1-46be-91c8-d814b48c2a99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61df4fc4-1d') Apr 18 16:24:01 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] No BDM found with device name vda, not building metadata. {{(pid=70975) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 18 16:24:01 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] No VIF found with MAC fa:16:3e:87:40:bf, not building metadata {{(pid=70975) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 18 16:24:01 user nova-compute[70975]: DEBUG nova.network.neutron [req-277ad55d-0d5f-4a29-93c1-cd218a58ad67 req-d0ea3c0a-9602-4101-a805-24bf7649a6e6 service nova] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Updated VIF entry in instance network info cache for port 61df4fc4-1d98-4d7b-b74e-280c61eac6ee. 
{{(pid=70975) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 18 16:24:01 user nova-compute[70975]: DEBUG nova.network.neutron [req-277ad55d-0d5f-4a29-93c1-cd218a58ad67 req-d0ea3c0a-9602-4101-a805-24bf7649a6e6 service nova] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Updating instance_info_cache with network_info: [{"id": "61df4fc4-1d98-4d7b-b74e-280c61eac6ee", "address": "fa:16:3e:87:40:bf", "network": {"id": "1cb85bc6-49f1-46be-91c8-d814b48c2a99", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1818136803-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "00d4b993c13b46ea8f80b0caed60a373", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap61df4fc4-1d", "ovs_interfaceid": "61df4fc4-1d98-4d7b-b74e-280c61eac6ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:24:01 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-277ad55d-0d5f-4a29-93c1-cd218a58ad67 req-d0ea3c0a-9602-4101-a805-24bf7649a6e6 service nova] Releasing lock "refresh_cache-776d1402-3e8a-407d-a20d-db46c1a21b23" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:24:02 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:02 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:02 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:02 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:02 user nova-compute[70975]: DEBUG nova.compute.manager [req-50c360fd-f9ca-4ac3-a5bf-95f6526679b5 req-5ef00d67-344f-4a04-a895-35d3453fff86 service nova] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Received event network-vif-plugged-61df4fc4-1d98-4d7b-b74e-280c61eac6ee {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:24:02 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-50c360fd-f9ca-4ac3-a5bf-95f6526679b5 req-5ef00d67-344f-4a04-a895-35d3453fff86 service nova] Acquiring lock "776d1402-3e8a-407d-a20d-db46c1a21b23-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:24:02 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-50c360fd-f9ca-4ac3-a5bf-95f6526679b5 req-5ef00d67-344f-4a04-a895-35d3453fff86 service nova] Lock "776d1402-3e8a-407d-a20d-db46c1a21b23-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:24:02 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-50c360fd-f9ca-4ac3-a5bf-95f6526679b5 req-5ef00d67-344f-4a04-a895-35d3453fff86 service nova] Lock "776d1402-3e8a-407d-a20d-db46c1a21b23-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:24:02 user nova-compute[70975]: DEBUG nova.compute.manager [req-50c360fd-f9ca-4ac3-a5bf-95f6526679b5 req-5ef00d67-344f-4a04-a895-35d3453fff86 service nova] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] No waiting events found dispatching network-vif-plugged-61df4fc4-1d98-4d7b-b74e-280c61eac6ee {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:24:02 user nova-compute[70975]: WARNING nova.compute.manager [req-50c360fd-f9ca-4ac3-a5bf-95f6526679b5 req-5ef00d67-344f-4a04-a895-35d3453fff86 service nova] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Received unexpected event network-vif-plugged-61df4fc4-1d98-4d7b-b74e-280c61eac6ee for instance with vm_state building and task_state spawning. Apr 18 16:24:03 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:03 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:03 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:03 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:04 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Resumed> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:24:04 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] VM Resumed (Lifecycle Event) Apr 18 16:24:04 user nova-compute[70975]: DEBUG nova.compute.manager [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Instance event wait completed in 0 seconds for {{(pid=70975) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 18 16:24:04 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Guest created on hypervisor {{(pid=70975) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 18 16:24:04 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Instance spawned successfully. 
Apr 18 16:24:04 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 18 16:24:04 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:24:04 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:24:04 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:24:04 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:24:04 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:24:04 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=70975) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 18 16:24:04 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:24:04 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Found default for hw_cdrom_bus of ide {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:24:04 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Found default for hw_disk_bus of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:24:04 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Found default for hw_input_bus of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:24:04 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Found default for hw_pointer_model of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:24:04 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Found default for hw_video_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:24:04 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Found default for hw_vif_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:24:04 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] During sync_power_state the instance has a pending task (spawning). Skip. 
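[Editor's illustration] The "Synchronizing instance power state" and "pending task (spawning). Skip." entries above describe a guard in the lifecycle handling: the database still records power_state 0 while the hypervisor already reports the guest running, but the sync is skipped because a task (spawning) is still in flight. The sketch below is a simplified illustration of that decision under assumed integer state codes (0 = NOSTATE, 1 = RUNNING); it is not Nova's actual implementation.

    # Simplified illustration of the power-state sync guard described in the
    # log entries above; not nova.compute.manager's real code.
    NOSTATE = 0   # assumed code: no power state recorded yet
    RUNNING = 1   # assumed code: hypervisor reports the guest running

    def sync_power_state(db_power_state, vm_power_state, task_state):
        """Return the power state to record, or None to skip the sync."""
        if task_state is not None:
            # A task such as 'spawning' owns the instance right now; let it
            # finish instead of racing it with a state update ("Skip" above).
            return None
        if db_power_state != vm_power_state:
            # Out of sync: trust the hypervisor's view of the guest.
            return vm_power_state
        return db_power_state

    # The situation logged above: DB says 0, hypervisor says 1, task 'spawning'.
    assert sync_power_state(NOSTATE, RUNNING, 'spawning') is None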
Apr 18 16:24:04 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Started> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:24:04 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] VM Started (Lifecycle Event) Apr 18 16:24:04 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:24:04 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:24:04 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] During sync_power_state the instance has a pending task (spawning). Skip. Apr 18 16:24:04 user nova-compute[70975]: INFO nova.compute.manager [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Took 5.72 seconds to spawn the instance on the hypervisor. Apr 18 16:24:04 user nova-compute[70975]: DEBUG nova.compute.manager [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:24:04 user nova-compute[70975]: DEBUG nova.compute.manager [req-a190f1f3-6f37-4952-b3a0-4039112952d3 req-246da944-4603-4450-a827-65b0b370d8b9 service nova] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Received event network-vif-plugged-61df4fc4-1d98-4d7b-b74e-280c61eac6ee {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:24:04 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-a190f1f3-6f37-4952-b3a0-4039112952d3 req-246da944-4603-4450-a827-65b0b370d8b9 service nova] Acquiring lock "776d1402-3e8a-407d-a20d-db46c1a21b23-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:24:04 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-a190f1f3-6f37-4952-b3a0-4039112952d3 req-246da944-4603-4450-a827-65b0b370d8b9 service nova] Lock "776d1402-3e8a-407d-a20d-db46c1a21b23-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:24:04 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-a190f1f3-6f37-4952-b3a0-4039112952d3 req-246da944-4603-4450-a827-65b0b370d8b9 service nova] Lock "776d1402-3e8a-407d-a20d-db46c1a21b23-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: 
held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:24:04 user nova-compute[70975]: DEBUG nova.compute.manager [req-a190f1f3-6f37-4952-b3a0-4039112952d3 req-246da944-4603-4450-a827-65b0b370d8b9 service nova] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] No waiting events found dispatching network-vif-plugged-61df4fc4-1d98-4d7b-b74e-280c61eac6ee {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:24:04 user nova-compute[70975]: WARNING nova.compute.manager [req-a190f1f3-6f37-4952-b3a0-4039112952d3 req-246da944-4603-4450-a827-65b0b370d8b9 service nova] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Received unexpected event network-vif-plugged-61df4fc4-1d98-4d7b-b74e-280c61eac6ee for instance with vm_state building and task_state spawning. Apr 18 16:24:04 user nova-compute[70975]: INFO nova.compute.manager [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Took 6.34 seconds to build instance. Apr 18 16:24:04 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-3ce238d2-be81-4537-99b2-f4d180f423a9 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lock "776d1402-3e8a-407d-a20d-db46c1a21b23" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.449s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:24:05 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:24:05 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Starting heal instance info cache {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 18 16:24:05 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "refresh_cache-66df9389-d007-4737-8bb1-55bcb5f227ff" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:24:05 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquired lock "refresh_cache-66df9389-d007-4737-8bb1-55bcb5f227ff" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:24:05 user nova-compute[70975]: DEBUG nova.network.neutron [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Forcefully refreshing network info cache for instance {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 18 16:24:05 user nova-compute[70975]: DEBUG nova.network.neutron [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Updating instance_info_cache with network_info: [{"id": "b66d41ab-873c-4826-a3f8-d4f4276fff10", "address": "fa:16:3e:58:32:25", "network": {"id": "236fa8aa-433b-4dfa-a787-f165c3389489", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1486162327-network", 
"subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5695adbb14ea4162bc40547b1509a1e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb66d41ab-87", "ovs_interfaceid": "b66d41ab-873c-4826-a3f8-d4f4276fff10", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:24:05 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Releasing lock "refresh_cache-66df9389-d007-4737-8bb1-55bcb5f227ff" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:24:05 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Updated the network info_cache for instance {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 18 16:24:05 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:24:06 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:06 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:24:06 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager.update_available_resource {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:24:06 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:24:06 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:24:06 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70975) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:24:06 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Auditing locally available compute resources for user (node: user) {{(pid=70975) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 18 16:24:06 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:24:06 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:24:06 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:24:06 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk --force-share --output=json" returned: 0 in 0.160s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:24:06 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6aece7dd-d545-4e26-9cb7-30ee0b01ebb2/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:24:07 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6aece7dd-d545-4e26-9cb7-30ee0b01ebb2/disk --force-share --output=json" returned: 0 in 0.151s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:24:07 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/6aece7dd-d545-4e26-9cb7-30ee0b01ebb2/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:24:07 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6aece7dd-d545-4e26-9cb7-30ee0b01ebb2/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:24:07 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/776d1402-3e8a-407d-a20d-db46c1a21b23/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:24:07 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/776d1402-3e8a-407d-a20d-db46c1a21b23/disk --force-share --output=json" returned: 0 in 0.148s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:24:07 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/776d1402-3e8a-407d-a20d-db46c1a21b23/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:24:07 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/776d1402-3e8a-407d-a20d-db46c1a21b23/disk --force-share --output=json" returned: 0 in 0.165s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:24:07 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a4febff2-74e8-47ef-820d-f407a4d22d9d/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:24:07 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:07 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/a4febff2-74e8-47ef-820d-f407a4d22d9d/disk --force-share --output=json" returned: 0 in 0.151s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:24:07 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a4febff2-74e8-47ef-820d-f407a4d22d9d/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:24:07 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a4febff2-74e8-47ef-820d-f407a4d22d9d/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:24:07 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:24:07 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:24:07 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:24:08 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:24:08 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:24:08 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 18 16:24:08 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Hypervisor/Node resource view: name=user free_ram=8677MB free_disk=26.457569122314453GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": 
"label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": 
"0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70975) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 18 16:24:08 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:24:08 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:24:08 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 6528f05a-9f05-4f35-b991-687e4f47029e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:24:08 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 66df9389-d007-4737-8bb1-55bcb5f227ff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:24:08 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:24:08 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance a4febff2-74e8-47ef-820d-f407a4d22d9d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:24:08 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 776d1402-3e8a-407d-a20d-db46c1a21b23 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:24:08 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Total usable vcpus: 12, total allocated vcpus: 5 {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 18 16:24:08 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Final resource view: name=user phys_ram=16023MB used_ram=1152MB phys_disk=40GB used_disk=5GB total_vcpus=12 used_vcpus=5 pci_stats=[] {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 18 16:24:08 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:24:08 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:24:08 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Compute_service record updated for user:user {{(pid=70975) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 18 16:24:08 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.325s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:24:09 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:24:11 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:12 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:12 user nova-compute[70975]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:13 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:24:16 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:16 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:20 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:21 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:22 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:23 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:26 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:27 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:31 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:32 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:32 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:33 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-7167605d-f6c0-4a3e-9611-d79ec436cbe2 tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Acquiring lock "a4febff2-74e8-47ef-820d-f407a4d22d9d" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:24:33 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-7167605d-f6c0-4a3e-9611-d79ec436cbe2 tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Lock "a4febff2-74e8-47ef-820d-f407a4d22d9d" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70975) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:24:33 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-7167605d-f6c0-4a3e-9611-d79ec436cbe2 tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Acquiring lock "a4febff2-74e8-47ef-820d-f407a4d22d9d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:24:33 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-7167605d-f6c0-4a3e-9611-d79ec436cbe2 tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Lock "a4febff2-74e8-47ef-820d-f407a4d22d9d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:24:33 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-7167605d-f6c0-4a3e-9611-d79ec436cbe2 tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Lock "a4febff2-74e8-47ef-820d-f407a4d22d9d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:24:33 user nova-compute[70975]: INFO nova.compute.manager [None req-7167605d-f6c0-4a3e-9611-d79ec436cbe2 tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Terminating instance Apr 18 16:24:33 user nova-compute[70975]: DEBUG nova.compute.manager [None req-7167605d-f6c0-4a3e-9611-d79ec436cbe2 tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Start destroying the instance on the hypervisor. 
{{(pid=70975) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 18 16:24:33 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:33 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:33 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:33 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:33 user nova-compute[70975]: DEBUG nova.compute.manager [req-81435a4a-deda-4558-9286-c61db5691100 req-105fac50-9111-4744-8c97-bc301f3c73cf service nova] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Received event network-vif-unplugged-5e7e767e-18ff-4103-8ea8-ce2a0375d42e {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:24:33 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-81435a4a-deda-4558-9286-c61db5691100 req-105fac50-9111-4744-8c97-bc301f3c73cf service nova] Acquiring lock "a4febff2-74e8-47ef-820d-f407a4d22d9d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:24:33 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-81435a4a-deda-4558-9286-c61db5691100 req-105fac50-9111-4744-8c97-bc301f3c73cf service nova] Lock "a4febff2-74e8-47ef-820d-f407a4d22d9d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:24:33 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-81435a4a-deda-4558-9286-c61db5691100 req-105fac50-9111-4744-8c97-bc301f3c73cf service nova] Lock "a4febff2-74e8-47ef-820d-f407a4d22d9d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:24:33 user nova-compute[70975]: DEBUG nova.compute.manager [req-81435a4a-deda-4558-9286-c61db5691100 req-105fac50-9111-4744-8c97-bc301f3c73cf service nova] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] No waiting events found dispatching network-vif-unplugged-5e7e767e-18ff-4103-8ea8-ce2a0375d42e {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:24:33 user nova-compute[70975]: DEBUG nova.compute.manager [req-81435a4a-deda-4558-9286-c61db5691100 req-105fac50-9111-4744-8c97-bc301f3c73cf service nova] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Received event network-vif-unplugged-5e7e767e-18ff-4103-8ea8-ce2a0375d42e for instance with task_state deleting. {{(pid=70975) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 18 16:24:33 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Instance destroyed successfully. 
Apr 18 16:24:33 user nova-compute[70975]: DEBUG nova.objects.instance [None req-7167605d-f6c0-4a3e-9611-d79ec436cbe2 tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Lazy-loading 'resources' on Instance uuid a4febff2-74e8-47ef-820d-f407a4d22d9d {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:24:33 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-7167605d-f6c0-4a3e-9611-d79ec436cbe2 tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:22:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1167960863',display_name='tempest-ServersNegativeTestJSON-server-1167960863',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-1167960863',id=22,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-18T16:22:50Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='5695adbb14ea4162bc40547b1509a1e4',ramdisk_id='',reservation_id='r-lgt4t5zg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServersNegativeTestJSON-1696086909',owner_user_name='tempest-ServersNegativeTestJSON-1696086909-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-18T16:22:51Z,user_data=None,user_id='2963911de4f34d79816a9a1f9ad24a27',uuid=a4febff2-74e8-47ef-820d-f407a4d22d9d,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5e7e767e-18ff-4103-8ea8-ce2a0375d42e", "address": "fa:16:3e:32:2a:5e", "network": {"id": "236fa8aa-433b-4dfa-a787-f165c3389489", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1486162327-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5695adbb14ea4162bc40547b1509a1e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": 
{"0": "ovn"}}, "devname": "tap5e7e767e-18", "ovs_interfaceid": "5e7e767e-18ff-4103-8ea8-ce2a0375d42e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 18 16:24:33 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-7167605d-f6c0-4a3e-9611-d79ec436cbe2 tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Converting VIF {"id": "5e7e767e-18ff-4103-8ea8-ce2a0375d42e", "address": "fa:16:3e:32:2a:5e", "network": {"id": "236fa8aa-433b-4dfa-a787-f165c3389489", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1486162327-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5695adbb14ea4162bc40547b1509a1e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e7e767e-18", "ovs_interfaceid": "5e7e767e-18ff-4103-8ea8-ce2a0375d42e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:24:33 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-7167605d-f6c0-4a3e-9611-d79ec436cbe2 tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:32:2a:5e,bridge_name='br-int',has_traffic_filtering=True,id=5e7e767e-18ff-4103-8ea8-ce2a0375d42e,network=Network(236fa8aa-433b-4dfa-a787-f165c3389489),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e7e767e-18') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:24:33 user nova-compute[70975]: DEBUG os_vif [None req-7167605d-f6c0-4a3e-9611-d79ec436cbe2 tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:2a:5e,bridge_name='br-int',has_traffic_filtering=True,id=5e7e767e-18ff-4103-8ea8-ce2a0375d42e,network=Network(236fa8aa-433b-4dfa-a787-f165c3389489),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e7e767e-18') {{(pid=70975) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 18 16:24:33 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:33 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5e7e767e-18, bridge=br-int, if_exists=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:24:33 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:33 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:24:33 user nova-compute[70975]: INFO os_vif [None req-7167605d-f6c0-4a3e-9611-d79ec436cbe2 tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:2a:5e,bridge_name='br-int',has_traffic_filtering=True,id=5e7e767e-18ff-4103-8ea8-ce2a0375d42e,network=Network(236fa8aa-433b-4dfa-a787-f165c3389489),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e7e767e-18') Apr 18 16:24:33 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-7167605d-f6c0-4a3e-9611-d79ec436cbe2 tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Deleting instance files /opt/stack/data/nova/instances/a4febff2-74e8-47ef-820d-f407a4d22d9d_del Apr 18 16:24:33 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-7167605d-f6c0-4a3e-9611-d79ec436cbe2 tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Deletion of /opt/stack/data/nova/instances/a4febff2-74e8-47ef-820d-f407a4d22d9d_del complete Apr 18 16:24:33 user nova-compute[70975]: INFO nova.compute.manager [None req-7167605d-f6c0-4a3e-9611-d79ec436cbe2 tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Took 0.65 seconds to destroy the instance on the hypervisor. Apr 18 16:24:33 user nova-compute[70975]: DEBUG oslo.service.loopingcall [None req-7167605d-f6c0-4a3e-9611-d79ec436cbe2 tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=70975) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 18 16:24:33 user nova-compute[70975]: DEBUG nova.compute.manager [-] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Deallocating network for instance {{(pid=70975) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 18 16:24:33 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] deallocate_for_instance() {{(pid=70975) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 18 16:24:34 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Updating instance_info_cache with network_info: [] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:24:34 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Took 0.52 seconds to deallocate network for instance. 
Apr 18 16:24:34 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-7167605d-f6c0-4a3e-9611-d79ec436cbe2 tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:24:34 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-7167605d-f6c0-4a3e-9611-d79ec436cbe2 tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:24:34 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-7167605d-f6c0-4a3e-9611-d79ec436cbe2 tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:24:34 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-7167605d-f6c0-4a3e-9611-d79ec436cbe2 tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:24:34 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-7167605d-f6c0-4a3e-9611-d79ec436cbe2 tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.282s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:24:34 user nova-compute[70975]: INFO nova.scheduler.client.report [None req-7167605d-f6c0-4a3e-9611-d79ec436cbe2 tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Deleted allocations for instance a4febff2-74e8-47ef-820d-f407a4d22d9d Apr 18 16:24:34 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-7167605d-f6c0-4a3e-9611-d79ec436cbe2 tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Lock "a4febff2-74e8-47ef-820d-f407a4d22d9d" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.634s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:24:35 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:35 user nova-compute[70975]: DEBUG nova.compute.manager [req-35643683-3233-4e4d-abf9-ed4465b0a7ab 
req-4995bf58-5fde-4587-917a-6c451d91f271 service nova] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Received event network-vif-plugged-5e7e767e-18ff-4103-8ea8-ce2a0375d42e {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:24:35 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-35643683-3233-4e4d-abf9-ed4465b0a7ab req-4995bf58-5fde-4587-917a-6c451d91f271 service nova] Acquiring lock "a4febff2-74e8-47ef-820d-f407a4d22d9d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:24:35 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-35643683-3233-4e4d-abf9-ed4465b0a7ab req-4995bf58-5fde-4587-917a-6c451d91f271 service nova] Lock "a4febff2-74e8-47ef-820d-f407a4d22d9d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:24:35 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-35643683-3233-4e4d-abf9-ed4465b0a7ab req-4995bf58-5fde-4587-917a-6c451d91f271 service nova] Lock "a4febff2-74e8-47ef-820d-f407a4d22d9d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:24:35 user nova-compute[70975]: DEBUG nova.compute.manager [req-35643683-3233-4e4d-abf9-ed4465b0a7ab req-4995bf58-5fde-4587-917a-6c451d91f271 service nova] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] No waiting events found dispatching network-vif-plugged-5e7e767e-18ff-4103-8ea8-ce2a0375d42e {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:24:35 user nova-compute[70975]: WARNING nova.compute.manager [req-35643683-3233-4e4d-abf9-ed4465b0a7ab req-4995bf58-5fde-4587-917a-6c451d91f271 service nova] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Received unexpected event network-vif-plugged-5e7e767e-18ff-4103-8ea8-ce2a0375d42e for instance with vm_state deleted and task_state None. 
Apr 18 16:24:35 user nova-compute[70975]: DEBUG nova.compute.manager [req-35643683-3233-4e4d-abf9-ed4465b0a7ab req-4995bf58-5fde-4587-917a-6c451d91f271 service nova] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Received event network-vif-deleted-5e7e767e-18ff-4103-8ea8-ce2a0375d42e {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:24:37 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:38 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:40 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:43 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:47 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Acquiring lock "50137f3d-a6c4-4ac5-8edb-fd5941f4e43b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:24:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "50137f3d-a6c4-4ac5-8edb-fd5941f4e43b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:24:48 user nova-compute[70975]: DEBUG nova.compute.manager [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] Starting instance... 
{{(pid=70975) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 18 16:24:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:24:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:24:48 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=70975) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 18 16:24:48 user nova-compute[70975]: INFO nova.compute.claims [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] Claim successful on node user Apr 18 16:24:48 user nova-compute[70975]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:24:48 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] VM Stopped (Lifecycle Event) Apr 18 16:24:48 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:48 user nova-compute[70975]: DEBUG nova.compute.manager [None req-93b718c3-5e49-4c82-9cd4-e580f2ac5ce9 None None] [instance: a4febff2-74e8-47ef-820d-f407a4d22d9d] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:24:49 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:24:49 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 
'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:24:49 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.276s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:24:49 user nova-compute[70975]: DEBUG nova.compute.manager [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] Start building networks asynchronously for instance. {{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 18 16:24:49 user nova-compute[70975]: DEBUG nova.compute.manager [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] Allocating IP information in the background. {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 18 16:24:49 user nova-compute[70975]: DEBUG nova.network.neutron [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] allocate_for_instance() {{(pid=70975) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 18 16:24:49 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 18 16:24:49 user nova-compute[70975]: DEBUG nova.compute.manager [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] Start building block device mappings for instance. 
{{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 18 16:24:49 user nova-compute[70975]: INFO nova.virt.block_device [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] Booting with volume-backed-image b11a20de-f82a-4158-b53e-0a0c7a1552cb at /dev/vda Apr 18 16:24:49 user nova-compute[70975]: DEBUG nova.policy [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c54c277689214bd0a2cadb1e2ac288a9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f516f5ec45ca4508841c77f79e8c038b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70975) authorize /opt/stack/nova/nova/policy.py:203}} Apr 18 16:24:49 user nova-compute[70975]: DEBUG nova.network.neutron [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] Successfully created port: b276194f-7dad-4a54-9b79-874ee97bffc0 {{(pid=70975) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 18 16:24:50 user nova-compute[70975]: DEBUG nova.network.neutron [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] Successfully updated port: b276194f-7dad-4a54-9b79-874ee97bffc0 {{(pid=70975) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 18 16:24:50 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Acquiring lock "refresh_cache-50137f3d-a6c4-4ac5-8edb-fd5941f4e43b" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:24:50 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Acquired lock "refresh_cache-50137f3d-a6c4-4ac5-8edb-fd5941f4e43b" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:24:50 user nova-compute[70975]: DEBUG nova.network.neutron [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] Building network info cache for instance {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 18 16:24:50 user nova-compute[70975]: DEBUG nova.compute.manager [req-7098cf00-a929-4e51-b3cd-236bf4fe51fc req-1ecbee5d-8ee8-419e-a3f1-9f26efa54e09 service nova] [instance: 
50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] Received event network-changed-b276194f-7dad-4a54-9b79-874ee97bffc0 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:24:50 user nova-compute[70975]: DEBUG nova.compute.manager [req-7098cf00-a929-4e51-b3cd-236bf4fe51fc req-1ecbee5d-8ee8-419e-a3f1-9f26efa54e09 service nova] [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] Refreshing instance network info cache due to event network-changed-b276194f-7dad-4a54-9b79-874ee97bffc0. {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 18 16:24:50 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-7098cf00-a929-4e51-b3cd-236bf4fe51fc req-1ecbee5d-8ee8-419e-a3f1-9f26efa54e09 service nova] Acquiring lock "refresh_cache-50137f3d-a6c4-4ac5-8edb-fd5941f4e43b" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:24:50 user nova-compute[70975]: DEBUG nova.network.neutron [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] Instance cache missing network info. {{(pid=70975) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 18 16:24:51 user nova-compute[70975]: DEBUG nova.network.neutron [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] Updating instance_info_cache with network_info: [{"id": "b276194f-7dad-4a54-9b79-874ee97bffc0", "address": "fa:16:3e:88:83:21", "network": {"id": "923d10dc-c67e-4426-9c6e-856e903e2446", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-131274818-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f516f5ec45ca4508841c77f79e8c038b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb276194f-7d", "ovs_interfaceid": "b276194f-7dad-4a54-9b79-874ee97bffc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:24:51 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Releasing lock "refresh_cache-50137f3d-a6c4-4ac5-8edb-fd5941f4e43b" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:24:51 user nova-compute[70975]: DEBUG nova.compute.manager [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] Instance network_info: |[{"id": 
"b276194f-7dad-4a54-9b79-874ee97bffc0", "address": "fa:16:3e:88:83:21", "network": {"id": "923d10dc-c67e-4426-9c6e-856e903e2446", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-131274818-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f516f5ec45ca4508841c77f79e8c038b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb276194f-7d", "ovs_interfaceid": "b276194f-7dad-4a54-9b79-874ee97bffc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 18 16:24:51 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-7098cf00-a929-4e51-b3cd-236bf4fe51fc req-1ecbee5d-8ee8-419e-a3f1-9f26efa54e09 service nova] Acquired lock "refresh_cache-50137f3d-a6c4-4ac5-8edb-fd5941f4e43b" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:24:51 user nova-compute[70975]: DEBUG nova.network.neutron [req-7098cf00-a929-4e51-b3cd-236bf4fe51fc req-1ecbee5d-8ee8-419e-a3f1-9f26efa54e09 service nova] [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] Refreshing network info cache for port b276194f-7dad-4a54-9b79-874ee97bffc0 {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 18 16:24:51 user nova-compute[70975]: DEBUG nova.network.neutron [req-7098cf00-a929-4e51-b3cd-236bf4fe51fc req-1ecbee5d-8ee8-419e-a3f1-9f26efa54e09 service nova] [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] Updated VIF entry in instance network info cache for port b276194f-7dad-4a54-9b79-874ee97bffc0. 
{{(pid=70975) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 18 16:24:51 user nova-compute[70975]: DEBUG nova.network.neutron [req-7098cf00-a929-4e51-b3cd-236bf4fe51fc req-1ecbee5d-8ee8-419e-a3f1-9f26efa54e09 service nova] [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] Updating instance_info_cache with network_info: [{"id": "b276194f-7dad-4a54-9b79-874ee97bffc0", "address": "fa:16:3e:88:83:21", "network": {"id": "923d10dc-c67e-4426-9c6e-856e903e2446", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-131274818-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f516f5ec45ca4508841c77f79e8c038b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb276194f-7d", "ovs_interfaceid": "b276194f-7dad-4a54-9b79-874ee97bffc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:24:51 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-7098cf00-a929-4e51-b3cd-236bf4fe51fc req-1ecbee5d-8ee8-419e-a3f1-9f26efa54e09 service nova] Releasing lock "refresh_cache-50137f3d-a6c4-4ac5-8edb-fd5941f4e43b" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:24:53 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:54 user nova-compute[70975]: WARNING nova.compute.manager [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Volume id: c10365c8-d370-4220-b2f1-57fb1d8c5adc finished being created but its status is error. Apr 18 16:24:54 user nova-compute[70975]: ERROR nova.compute.manager [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] Instance failed block device setup: nova.exception.VolumeNotCreated: Volume c10365c8-d370-4220-b2f1-57fb1d8c5adc did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error. 
Apr 18 16:24:54 user nova-compute[70975]: ERROR nova.compute.manager [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] Traceback (most recent call last):
Apr 18 16:24:54 user nova-compute[70975]: ERROR nova.compute.manager [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] File "/opt/stack/nova/nova/compute/manager.py", line 2175, in _prep_block_device
Apr 18 16:24:54 user nova-compute[70975]: ERROR nova.compute.manager [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] driver_block_device.attach_block_devices(
Apr 18 16:24:54 user nova-compute[70975]: ERROR nova.compute.manager [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] File "/opt/stack/nova/nova/virt/block_device.py", line 936, in attach_block_devices
Apr 18 16:24:54 user nova-compute[70975]: ERROR nova.compute.manager [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] _log_and_attach(device)
Apr 18 16:24:54 user nova-compute[70975]: ERROR nova.compute.manager [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] File "/opt/stack/nova/nova/virt/block_device.py", line 933, in _log_and_attach
Apr 18 16:24:54 user nova-compute[70975]: ERROR nova.compute.manager [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] bdm.attach(*attach_args, **attach_kwargs)
Apr 18 16:24:54 user nova-compute[70975]: ERROR nova.compute.manager [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] File "/opt/stack/nova/nova/virt/block_device.py", line 831, in attach
Apr 18 16:24:54 user nova-compute[70975]: ERROR nova.compute.manager [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] self.volume_id, self.attachment_id = self._create_volume(
Apr 18 16:24:54 user nova-compute[70975]: ERROR nova.compute.manager [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] File "/opt/stack/nova/nova/virt/block_device.py", line 435, in _create_volume
Apr 18 16:24:54 user nova-compute[70975]: ERROR nova.compute.manager [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] self._call_wait_func(context, wait_func, volume_api, vol['id'])
Apr 18 16:24:54 user nova-compute[70975]: ERROR nova.compute.manager [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] File "/opt/stack/nova/nova/virt/block_device.py", line 785, in _call_wait_func
Apr 18 16:24:54 user nova-compute[70975]: ERROR nova.compute.manager [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] with excutils.save_and_reraise_exception():
Apr 18 16:24:54 user nova-compute[70975]: ERROR nova.compute.manager [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
Apr 18 16:24:54 user nova-compute[70975]: ERROR nova.compute.manager [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] self.force_reraise()
Apr 18 16:24:54 user nova-compute[70975]: ERROR nova.compute.manager [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
Apr 18 16:24:54 user nova-compute[70975]: ERROR nova.compute.manager [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] raise self.value
Apr 18 16:24:54 user nova-compute[70975]: ERROR nova.compute.manager [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] File "/opt/stack/nova/nova/virt/block_device.py", line 783, in _call_wait_func
Apr 18 16:24:54 user nova-compute[70975]: ERROR nova.compute.manager [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] wait_func(context, volume_id)
Apr 18 16:24:54 user nova-compute[70975]: ERROR nova.compute.manager [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] File "/opt/stack/nova/nova/compute/manager.py", line 1792, in _await_block_device_map_created
Apr 18 16:24:54 user nova-compute[70975]: ERROR nova.compute.manager [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] raise exception.VolumeNotCreated(volume_id=vol_id,
Apr 18 16:24:54 user nova-compute[70975]: ERROR nova.compute.manager [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] nova.exception.VolumeNotCreated: Volume c10365c8-d370-4220-b2f1-57fb1d8c5adc did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error.
Apr 18 16:24:54 user nova-compute[70975]: ERROR nova.compute.manager [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b]
Apr 18 16:24:54 user nova-compute[70975]: DEBUG nova.compute.claims [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] Aborting claim: {{(pid=70975) abort /opt/stack/nova/nova/compute/claims.py:84}}
Apr 18 16:24:54 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
Apr 18 16:24:54 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
Apr 18 16:24:54 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
Apr 18 16:24:54 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
Apr 18 16:24:54 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.236s {{(pid=70975) inner
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:24:54 user nova-compute[70975]: DEBUG nova.compute.manager [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] Build of instance 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b aborted: Volume c10365c8-d370-4220-b2f1-57fb1d8c5adc did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error. {{(pid=70975) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2636}} Apr 18 16:24:54 user nova-compute[70975]: DEBUG nova.compute.utils [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] Build of instance 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b aborted: Volume c10365c8-d370-4220-b2f1-57fb1d8c5adc did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error. {{(pid=70975) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} Apr 18 16:24:54 user nova-compute[70975]: ERROR nova.compute.manager [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] Build of instance 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b aborted: Volume c10365c8-d370-4220-b2f1-57fb1d8c5adc did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error.: nova.exception.BuildAbortException: Build of instance 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b aborted: Volume c10365c8-d370-4220-b2f1-57fb1d8c5adc did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error. 
Apr 18 16:24:54 user nova-compute[70975]: DEBUG nova.compute.manager [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] Unplugging VIFs for instance {{(pid=70975) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} Apr 18 16:24:54 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:24:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1790437573',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1790437573',id=24,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f516f5ec45ca4508841c77f79e8c038b',ramdisk_id='',reservation_id='r-wabd2hew',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-2021464272',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member'},tags=TagList,task_state='block_device_mapping',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:24:49Z,user_data=None,user_id='c54c277689214bd0a2cadb1e2ac288a9',uuid=50137f3d-a6c4-4ac5-8edb-fd5941f4e43b,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b276194f-7dad-4a54-9b79-874ee97bffc0", "address": "fa:16:3e:88:83:21", "network": {"id": "923d10dc-c67e-4426-9c6e-856e903e2446", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-131274818-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f516f5ec45ca4508841c77f79e8c038b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": 
"tapb276194f-7d", "ovs_interfaceid": "b276194f-7dad-4a54-9b79-874ee97bffc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 18 16:24:54 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Converting VIF {"id": "b276194f-7dad-4a54-9b79-874ee97bffc0", "address": "fa:16:3e:88:83:21", "network": {"id": "923d10dc-c67e-4426-9c6e-856e903e2446", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-131274818-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f516f5ec45ca4508841c77f79e8c038b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb276194f-7d", "ovs_interfaceid": "b276194f-7dad-4a54-9b79-874ee97bffc0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:24:54 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:88:83:21,bridge_name='br-int',has_traffic_filtering=True,id=b276194f-7dad-4a54-9b79-874ee97bffc0,network=Network(923d10dc-c67e-4426-9c6e-856e903e2446),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb276194f-7d') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:24:54 user nova-compute[70975]: DEBUG os_vif [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:88:83:21,bridge_name='br-int',has_traffic_filtering=True,id=b276194f-7dad-4a54-9b79-874ee97bffc0,network=Network(923d10dc-c67e-4426-9c6e-856e903e2446),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb276194f-7d') {{(pid=70975) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 18 16:24:54 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:24:54 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb276194f-7d, bridge=br-int, if_exists=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:24:54 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no 
change {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 18 16:24:54 user nova-compute[70975]: INFO os_vif [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:88:83:21,bridge_name='br-int',has_traffic_filtering=True,id=b276194f-7dad-4a54-9b79-874ee97bffc0,network=Network(923d10dc-c67e-4426-9c6e-856e903e2446),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb276194f-7d') Apr 18 16:24:54 user nova-compute[70975]: DEBUG nova.compute.manager [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] Unplugged VIFs for instance {{(pid=70975) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} Apr 18 16:24:54 user nova-compute[70975]: DEBUG nova.compute.manager [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] Deallocating network for instance {{(pid=70975) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 18 16:24:54 user nova-compute[70975]: DEBUG nova.network.neutron [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] deallocate_for_instance() {{(pid=70975) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 18 16:24:55 user nova-compute[70975]: DEBUG nova.network.neutron [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] Updating instance_info_cache with network_info: [] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:24:55 user nova-compute[70975]: INFO nova.compute.manager [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b] Took 0.56 seconds to deallocate network for instance. 
Apr 18 16:24:55 user nova-compute[70975]: INFO nova.scheduler.client.report [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Deleted allocations for instance 50137f3d-a6c4-4ac5-8edb-fd5941f4e43b Apr 18 16:24:55 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-ef4d3a05-460e-4caf-b497-003f15e39ce1 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "50137f3d-a6c4-4ac5-8edb-fd5941f4e43b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.986s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:24:58 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 3933-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:25:02 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:25:02 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:25:03 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:25:03 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=70975) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 18 16:25:03 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:25:05 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:25:06 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:25:06 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager.update_available_resource {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:25:06 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:25:06 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:25:06 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:25:06 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Auditing locally available compute resources for user (node: user) {{(pid=70975) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 18 16:25:06 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:25:06 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:25:06 user nova-compute[70975]: DEBUG 
oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:25:06 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk --force-share --output=json" returned: 0 in 0.127s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:25:06 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6aece7dd-d545-4e26-9cb7-30ee0b01ebb2/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:25:06 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6aece7dd-d545-4e26-9cb7-30ee0b01ebb2/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:25:06 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6aece7dd-d545-4e26-9cb7-30ee0b01ebb2/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:25:07 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6aece7dd-d545-4e26-9cb7-30ee0b01ebb2/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:25:07 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/776d1402-3e8a-407d-a20d-db46c1a21b23/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:25:07 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/776d1402-3e8a-407d-a20d-db46c1a21b23/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:25:07 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/776d1402-3e8a-407d-a20d-db46c1a21b23/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:25:07 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/776d1402-3e8a-407d-a20d-db46c1a21b23/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:25:07 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:25:07 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:25:07 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:25:07 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:25:08 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:25:08 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 18 16:25:08 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Hypervisor/Node resource view: name=user free_ram=8816MB free_disk=26.44647216796875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": 
"label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": 
"0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70975) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 18 16:25:08 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:25:08 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:25:08 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 6528f05a-9f05-4f35-b991-687e4f47029e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:25:08 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 66df9389-d007-4737-8bb1-55bcb5f227ff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:25:08 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:25:08 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 776d1402-3e8a-407d-a20d-db46c1a21b23 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:25:08 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Total usable vcpus: 12, total allocated vcpus: 4 {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 18 16:25:08 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Final resource view: name=user phys_ram=16023MB used_ram=1024MB phys_disk=40GB used_disk=4GB total_vcpus=12 used_vcpus=4 pci_stats=[] {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 18 16:25:08 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Refreshing inventories for resource provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 18 16:25:08 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Updating ProviderTree inventory for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 18 16:25:08 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Updating inventory in ProviderTree for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 with inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 18 16:25:08 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Refreshing aggregate associations for resource provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9, aggregates: None {{(pid=70975) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 18 16:25:08 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Refreshing trait associations for resource provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9, traits: 
COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE41,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_RESCUE_BFV,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_IDE,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE42 {{(pid=70975) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 18 16:25:08 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:25:08 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:25:08 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Compute_service record updated for user:user {{(pid=70975) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 18 16:25:08 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.464s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:25:08 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:25:08 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Cleaning up deleted instances with incomplete migration {{(pid=70975) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 18 16:25:08 user nova-compute[70975]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:25:09 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:25:09 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Starting heal instance info cache {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 18 16:25:09 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "refresh_cache-6aece7dd-d545-4e26-9cb7-30ee0b01ebb2" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:25:09 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquired lock "refresh_cache-6aece7dd-d545-4e26-9cb7-30ee0b01ebb2" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:25:09 user nova-compute[70975]: DEBUG nova.network.neutron [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Forcefully refreshing network info cache for instance {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 18 16:25:09 user nova-compute[70975]: DEBUG nova.network.neutron [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Updating instance_info_cache with network_info: [{"id": "8d71cc71-9d8c-428d-ad04-69a31a967fe9", "address": "fa:16:3e:f6:64:6c", "network": {"id": "923d10dc-c67e-4426-9c6e-856e903e2446", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-131274818-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f516f5ec45ca4508841c77f79e8c038b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d71cc71-9d", "ovs_interfaceid": "8d71cc71-9d8c-428d-ad04-69a31a967fe9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:25:09 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Releasing lock "refresh_cache-6aece7dd-d545-4e26-9cb7-30ee0b01ebb2" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:25:09 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Updated the network info_cache for instance {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 18 16:25:09 user 
nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:25:09 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:25:09 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:25:10 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:25:10 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Cleaning up deleted instances {{(pid=70975) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 18 16:25:10 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] There are 0 instances to clean {{(pid=70975) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 18 16:25:13 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:25:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:25:18 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:25:23 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:25:27 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:25:28 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:25:33 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:25:38 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:25:39 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-7e5e4c70-5a78-4556-b16e-d2f033fbe1a7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Acquiring lock "6aece7dd-d545-4e26-9cb7-30ee0b01ebb2" by 
"nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:25:39 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-7e5e4c70-5a78-4556-b16e-d2f033fbe1a7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "6aece7dd-d545-4e26-9cb7-30ee0b01ebb2" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:25:39 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-7e5e4c70-5a78-4556-b16e-d2f033fbe1a7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Acquiring lock "6aece7dd-d545-4e26-9cb7-30ee0b01ebb2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:25:39 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-7e5e4c70-5a78-4556-b16e-d2f033fbe1a7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "6aece7dd-d545-4e26-9cb7-30ee0b01ebb2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:25:39 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-7e5e4c70-5a78-4556-b16e-d2f033fbe1a7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "6aece7dd-d545-4e26-9cb7-30ee0b01ebb2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:25:39 user nova-compute[70975]: INFO nova.compute.manager [None req-7e5e4c70-5a78-4556-b16e-d2f033fbe1a7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Terminating instance Apr 18 16:25:39 user nova-compute[70975]: DEBUG nova.compute.manager [None req-7e5e4c70-5a78-4556-b16e-d2f033fbe1a7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Start destroying the instance on the hypervisor. 
{{(pid=70975) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 18 16:25:39 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:25:39 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:25:39 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:25:39 user nova-compute[70975]: DEBUG nova.compute.manager [req-11ad094a-40ba-4be6-92a3-d0e430786109 req-fb97c8ff-5f1f-4861-a2f0-62f996377a37 service nova] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Received event network-vif-unplugged-8d71cc71-9d8c-428d-ad04-69a31a967fe9 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:25:39 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-11ad094a-40ba-4be6-92a3-d0e430786109 req-fb97c8ff-5f1f-4861-a2f0-62f996377a37 service nova] Acquiring lock "6aece7dd-d545-4e26-9cb7-30ee0b01ebb2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:25:39 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-11ad094a-40ba-4be6-92a3-d0e430786109 req-fb97c8ff-5f1f-4861-a2f0-62f996377a37 service nova] Lock "6aece7dd-d545-4e26-9cb7-30ee0b01ebb2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:25:39 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-11ad094a-40ba-4be6-92a3-d0e430786109 req-fb97c8ff-5f1f-4861-a2f0-62f996377a37 service nova] Lock "6aece7dd-d545-4e26-9cb7-30ee0b01ebb2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:25:39 user nova-compute[70975]: DEBUG nova.compute.manager [req-11ad094a-40ba-4be6-92a3-d0e430786109 req-fb97c8ff-5f1f-4861-a2f0-62f996377a37 service nova] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] No waiting events found dispatching network-vif-unplugged-8d71cc71-9d8c-428d-ad04-69a31a967fe9 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:25:39 user nova-compute[70975]: DEBUG nova.compute.manager [req-11ad094a-40ba-4be6-92a3-d0e430786109 req-fb97c8ff-5f1f-4861-a2f0-62f996377a37 service nova] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Received event network-vif-unplugged-8d71cc71-9d8c-428d-ad04-69a31a967fe9 for instance with task_state deleting. {{(pid=70975) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 18 16:25:39 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Instance destroyed successfully. 
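
Editor's note: the inventory payload the resource tracker pushes to Placement at 16:25:08 (VCPU total 12 with allocation_ratio 4.0, MEMORY_MB total 16023 with 512 reserved, DISK_GB total 40) is what bounds how much the scheduler can hand out: effective capacity per resource class is (total - reserved) * allocation_ratio. The snippet below is a minimal illustrative sketch (not Nova source) that recomputes those figures from the values logged above.

    # Illustrative sketch only: recompute the schedulable capacity implied by
    # the inventory dict logged by the resource tracker / report client.
    inventory = {
        "VCPU":      {"total": 12,    "reserved": 0,   "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 16023, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB":   {"total": 40,    "reserved": 0,   "allocation_ratio": 1.0},
    }

    for rc, inv in inventory.items():
        capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(f"{rc}: schedulable capacity = {capacity:g}")

    # Expected output with the logged values:
    #   VCPU: schedulable capacity = 48
    #   MEMORY_MB: schedulable capacity = 15511
    #   DISK_GB: schedulable capacity = 40

This is consistent with the final resource view reported earlier (12 physical vCPUs, 4 allocated): CPU is oversubscribable at 4:1 on this host, while memory and disk are not.
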
Apr 18 16:25:39 user nova-compute[70975]: DEBUG nova.objects.instance [None req-7e5e4c70-5a78-4556-b16e-d2f033fbe1a7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lazy-loading 'resources' on Instance uuid 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2 {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:25:39 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-7e5e4c70-5a78-4556-b16e-d2f033fbe1a7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:21:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1835828907',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1835828907',id=21,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-18T16:21:56Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='f516f5ec45ca4508841c77f79e8c038b',ramdisk_id='',reservation_id='r-jkqz8m09',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-2021464272',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-18T16:23:47Z,user_data=None,user_id='c54c277689214bd0a2cadb1e2ac288a9',uuid=6aece7dd-d545-4e26-9cb7-30ee0b01ebb2,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8d71cc71-9d8c-428d-ad04-69a31a967fe9", "address": "fa:16:3e:f6:64:6c", "network": {"id": "923d10dc-c67e-4426-9c6e-856e903e2446", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-131274818-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f516f5ec45ca4508841c77f79e8c038b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", 
"details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d71cc71-9d", "ovs_interfaceid": "8d71cc71-9d8c-428d-ad04-69a31a967fe9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 18 16:25:39 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-7e5e4c70-5a78-4556-b16e-d2f033fbe1a7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Converting VIF {"id": "8d71cc71-9d8c-428d-ad04-69a31a967fe9", "address": "fa:16:3e:f6:64:6c", "network": {"id": "923d10dc-c67e-4426-9c6e-856e903e2446", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-131274818-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f516f5ec45ca4508841c77f79e8c038b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d71cc71-9d", "ovs_interfaceid": "8d71cc71-9d8c-428d-ad04-69a31a967fe9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:25:39 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-7e5e4c70-5a78-4556-b16e-d2f033fbe1a7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f6:64:6c,bridge_name='br-int',has_traffic_filtering=True,id=8d71cc71-9d8c-428d-ad04-69a31a967fe9,network=Network(923d10dc-c67e-4426-9c6e-856e903e2446),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8d71cc71-9d') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:25:39 user nova-compute[70975]: DEBUG os_vif [None req-7e5e4c70-5a78-4556-b16e-d2f033fbe1a7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f6:64:6c,bridge_name='br-int',has_traffic_filtering=True,id=8d71cc71-9d8c-428d-ad04-69a31a967fe9,network=Network(923d10dc-c67e-4426-9c6e-856e903e2446),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8d71cc71-9d') {{(pid=70975) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 18 16:25:39 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:25:39 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8d71cc71-9d, bridge=br-int, if_exists=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:25:39 user 
nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:25:39 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:25:39 user nova-compute[70975]: INFO os_vif [None req-7e5e4c70-5a78-4556-b16e-d2f033fbe1a7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f6:64:6c,bridge_name='br-int',has_traffic_filtering=True,id=8d71cc71-9d8c-428d-ad04-69a31a967fe9,network=Network(923d10dc-c67e-4426-9c6e-856e903e2446),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8d71cc71-9d') Apr 18 16:25:39 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-7e5e4c70-5a78-4556-b16e-d2f033fbe1a7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Deleting instance files /opt/stack/data/nova/instances/6aece7dd-d545-4e26-9cb7-30ee0b01ebb2_del Apr 18 16:25:39 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-7e5e4c70-5a78-4556-b16e-d2f033fbe1a7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Deletion of /opt/stack/data/nova/instances/6aece7dd-d545-4e26-9cb7-30ee0b01ebb2_del complete Apr 18 16:25:39 user nova-compute[70975]: INFO nova.compute.manager [None req-7e5e4c70-5a78-4556-b16e-d2f033fbe1a7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Took 0.66 seconds to destroy the instance on the hypervisor. Apr 18 16:25:39 user nova-compute[70975]: DEBUG oslo.service.loopingcall [None req-7e5e4c70-5a78-4556-b16e-d2f033fbe1a7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=70975) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 18 16:25:39 user nova-compute[70975]: DEBUG nova.compute.manager [-] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Deallocating network for instance {{(pid=70975) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 18 16:25:39 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] deallocate_for_instance() {{(pid=70975) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 18 16:25:40 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Updating instance_info_cache with network_info: [] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:25:40 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Took 0.63 seconds to deallocate network for instance. 
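
Editor's note: during the unplug above, os-vif hands the VIFOpenVSwitch object to its ovs plugin, which issues an OVSDB transaction (DelPortCommand with port=tap8d71cc71-9d, bridge=br-int, if_exists=True). The sketch below shows a roughly equivalent manual operation with the ovs-vsctl CLI; it is for verification or cleanup by hand only, since Nova/os-vif talk to OVSDB natively via ovsdbapp as the log shows.

    # Illustrative sketch: CLI equivalent of the DelPortCommand transaction
    # logged above. Requires ovs-vsctl on the host; not what nova-compute runs.
    import subprocess

    def del_ovs_port(bridge: str, port: str) -> None:
        # --if-exists makes the call a no-op if the port is already gone,
        # matching if_exists=True in the logged transaction.
        subprocess.run(
            ["ovs-vsctl", "--if-exists", "del-port", bridge, port],
            check=True,
        )

    del_ovs_port("br-int", "tap8d71cc71-9d")
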
Apr 18 16:25:40 user nova-compute[70975]: DEBUG nova.compute.manager [req-d3e7bf06-fedb-449f-95a4-7f3a0993505b req-49e3decd-d600-4497-98d8-17b267dc0216 service nova] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Received event network-vif-deleted-8d71cc71-9d8c-428d-ad04-69a31a967fe9 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:25:40 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-7e5e4c70-5a78-4556-b16e-d2f033fbe1a7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:25:40 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-7e5e4c70-5a78-4556-b16e-d2f033fbe1a7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:25:40 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-7e5e4c70-5a78-4556-b16e-d2f033fbe1a7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:25:40 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-7e5e4c70-5a78-4556-b16e-d2f033fbe1a7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:25:40 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-7e5e4c70-5a78-4556-b16e-d2f033fbe1a7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.169s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:25:40 user nova-compute[70975]: INFO nova.scheduler.client.report [None req-7e5e4c70-5a78-4556-b16e-d2f033fbe1a7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Deleted allocations for instance 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2 Apr 18 16:25:40 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-7e5e4c70-5a78-4556-b16e-d2f033fbe1a7 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock 
"6aece7dd-d545-4e26-9cb7-30ee0b01ebb2" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.633s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:25:41 user nova-compute[70975]: DEBUG nova.compute.manager [req-277daca2-5e47-4581-865e-8149c6022f17 req-b20b6e66-dcfc-465d-8c04-c485ba1ed482 service nova] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Received event network-vif-plugged-8d71cc71-9d8c-428d-ad04-69a31a967fe9 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:25:41 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-277daca2-5e47-4581-865e-8149c6022f17 req-b20b6e66-dcfc-465d-8c04-c485ba1ed482 service nova] Acquiring lock "6aece7dd-d545-4e26-9cb7-30ee0b01ebb2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:25:41 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-277daca2-5e47-4581-865e-8149c6022f17 req-b20b6e66-dcfc-465d-8c04-c485ba1ed482 service nova] Lock "6aece7dd-d545-4e26-9cb7-30ee0b01ebb2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:25:41 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-277daca2-5e47-4581-865e-8149c6022f17 req-b20b6e66-dcfc-465d-8c04-c485ba1ed482 service nova] Lock "6aece7dd-d545-4e26-9cb7-30ee0b01ebb2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:25:41 user nova-compute[70975]: DEBUG nova.compute.manager [req-277daca2-5e47-4581-865e-8149c6022f17 req-b20b6e66-dcfc-465d-8c04-c485ba1ed482 service nova] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] No waiting events found dispatching network-vif-plugged-8d71cc71-9d8c-428d-ad04-69a31a967fe9 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:25:41 user nova-compute[70975]: WARNING nova.compute.manager [req-277daca2-5e47-4581-865e-8149c6022f17 req-b20b6e66-dcfc-465d-8c04-c485ba1ed482 service nova] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Received unexpected event network-vif-plugged-8d71cc71-9d8c-428d-ad04-69a31a967fe9 for instance with vm_state deleted and task_state None. 
Apr 18 16:25:44 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:25:49 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:25:50 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-64ed6fdb-c332-4a79-88a0-0684cf7144a8 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Acquiring lock "776d1402-3e8a-407d-a20d-db46c1a21b23" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:25:50 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-64ed6fdb-c332-4a79-88a0-0684cf7144a8 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lock "776d1402-3e8a-407d-a20d-db46c1a21b23" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:25:50 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-64ed6fdb-c332-4a79-88a0-0684cf7144a8 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Acquiring lock "776d1402-3e8a-407d-a20d-db46c1a21b23-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:25:50 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-64ed6fdb-c332-4a79-88a0-0684cf7144a8 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lock "776d1402-3e8a-407d-a20d-db46c1a21b23-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:25:50 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-64ed6fdb-c332-4a79-88a0-0684cf7144a8 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lock "776d1402-3e8a-407d-a20d-db46c1a21b23-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:25:50 user nova-compute[70975]: INFO nova.compute.manager [None req-64ed6fdb-c332-4a79-88a0-0684cf7144a8 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Terminating instance Apr 18 16:25:50 user nova-compute[70975]: DEBUG nova.compute.manager [None req-64ed6fdb-c332-4a79-88a0-0684cf7144a8 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Start destroying the instance on the hypervisor. 
{{(pid=70975) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 18 16:25:50 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:25:50 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:25:50 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:25:51 user nova-compute[70975]: DEBUG nova.compute.manager [req-99491e90-3f29-4686-b08a-ae24ec3d1ba9 req-6e7957e6-efcc-45f6-adb6-625063eb1c2a service nova] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Received event network-vif-unplugged-61df4fc4-1d98-4d7b-b74e-280c61eac6ee {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:25:51 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-99491e90-3f29-4686-b08a-ae24ec3d1ba9 req-6e7957e6-efcc-45f6-adb6-625063eb1c2a service nova] Acquiring lock "776d1402-3e8a-407d-a20d-db46c1a21b23-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:25:51 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-99491e90-3f29-4686-b08a-ae24ec3d1ba9 req-6e7957e6-efcc-45f6-adb6-625063eb1c2a service nova] Lock "776d1402-3e8a-407d-a20d-db46c1a21b23-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:25:51 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-99491e90-3f29-4686-b08a-ae24ec3d1ba9 req-6e7957e6-efcc-45f6-adb6-625063eb1c2a service nova] Lock "776d1402-3e8a-407d-a20d-db46c1a21b23-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:25:51 user nova-compute[70975]: DEBUG nova.compute.manager [req-99491e90-3f29-4686-b08a-ae24ec3d1ba9 req-6e7957e6-efcc-45f6-adb6-625063eb1c2a service nova] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] No waiting events found dispatching network-vif-unplugged-61df4fc4-1d98-4d7b-b74e-280c61eac6ee {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:25:51 user nova-compute[70975]: DEBUG nova.compute.manager [req-99491e90-3f29-4686-b08a-ae24ec3d1ba9 req-6e7957e6-efcc-45f6-adb6-625063eb1c2a service nova] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Received event network-vif-unplugged-61df4fc4-1d98-4d7b-b74e-280c61eac6ee for instance with task_state deleting. {{(pid=70975) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 18 16:25:51 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Instance destroyed successfully. 
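
Editor's note: the Acquiring / "acquired ... waited" / "released ... held" triplets around do_terminate_instance and _clear_events come from oslo.concurrency's synchronized wrapper (the lockutils.py:404/409/423 "inner" frames in the log): terminate runs under a lock named after the instance UUID, and event handling briefly takes "<uuid>-events". Below is a minimal sketch of that pattern using oslo.concurrency directly; the function and lock names are illustrative, not Nova's.

    # Minimal sketch of the per-instance locking pattern behind the
    # Acquiring/acquired/released lines above.
    from oslo_concurrency import lockutils

    def terminate(instance_uuid: str) -> None:
        @lockutils.synchronized(instance_uuid)
        def _do_terminate():
            # Everything here runs while holding the per-instance lock, so
            # concurrent lifecycle operations on the same instance serialize.
            print(f"terminating {instance_uuid}")

        _do_terminate()

    terminate("776d1402-3e8a-407d-a20d-db46c1a21b23")
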
Apr 18 16:25:51 user nova-compute[70975]: DEBUG nova.objects.instance [None req-64ed6fdb-c332-4a79-88a0-0684cf7144a8 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lazy-loading 'resources' on Instance uuid 776d1402-3e8a-407d-a20d-db46c1a21b23 {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:25:51 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-64ed6fdb-c332-4a79-88a0-0684cf7144a8 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:23:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-1809716632',display_name='tempest-TestMinimumBasicScenario-server-1809716632',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-1809716632',id=23,image_ref='96f134fc-75ab-496a-9864-a702b8f45c60',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEaEdQSpuucmjUI7CgJV9d16I0XBov7QClewDi9SugBZN+QXiQWDSnpN5XqyApn5LKFQekmnDneoXwFp0uoqimvSuH7yO+qi9pw7ukWtt3Q/33VBcmVZNDF9YXa9rca7pw==',key_name='tempest-TestMinimumBasicScenario-1759395561',keypairs=,launch_index=0,launched_at=2023-04-18T16:24:04Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='00d4b993c13b46ea8f80b0caed60a373',ramdisk_id='',reservation_id='r-kut7p5u6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='96f134fc-75ab-496a-9864-a702b8f45c60',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestMinimumBasicScenario-386776323',owner_user_name='tempest-TestMinimumBasicScenario-386776323-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-18T16:24:05Z,user_data=None,user_id='add4c9d906ba49a590f203e0aa98ab64',uuid=776d1402-3e8a-407d-a20d-db46c1a21b23,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "61df4fc4-1d98-4d7b-b74e-280c61eac6ee", "address": "fa:16:3e:87:40:bf", "network": {"id": "1cb85bc6-49f1-46be-91c8-d814b48c2a99", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1818136803-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "00d4b993c13b46ea8f80b0caed60a373", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", 
"details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap61df4fc4-1d", "ovs_interfaceid": "61df4fc4-1d98-4d7b-b74e-280c61eac6ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 18 16:25:51 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-64ed6fdb-c332-4a79-88a0-0684cf7144a8 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Converting VIF {"id": "61df4fc4-1d98-4d7b-b74e-280c61eac6ee", "address": "fa:16:3e:87:40:bf", "network": {"id": "1cb85bc6-49f1-46be-91c8-d814b48c2a99", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1818136803-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "00d4b993c13b46ea8f80b0caed60a373", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap61df4fc4-1d", "ovs_interfaceid": "61df4fc4-1d98-4d7b-b74e-280c61eac6ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:25:51 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-64ed6fdb-c332-4a79-88a0-0684cf7144a8 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:87:40:bf,bridge_name='br-int',has_traffic_filtering=True,id=61df4fc4-1d98-4d7b-b74e-280c61eac6ee,network=Network(1cb85bc6-49f1-46be-91c8-d814b48c2a99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61df4fc4-1d') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:25:51 user nova-compute[70975]: DEBUG os_vif [None req-64ed6fdb-c332-4a79-88a0-0684cf7144a8 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:40:bf,bridge_name='br-int',has_traffic_filtering=True,id=61df4fc4-1d98-4d7b-b74e-280c61eac6ee,network=Network(1cb85bc6-49f1-46be-91c8-d814b48c2a99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61df4fc4-1d') {{(pid=70975) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 18 16:25:51 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:25:51 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap61df4fc4-1d, bridge=br-int, if_exists=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:25:51 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 
{{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:25:51 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:25:51 user nova-compute[70975]: INFO os_vif [None req-64ed6fdb-c332-4a79-88a0-0684cf7144a8 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:40:bf,bridge_name='br-int',has_traffic_filtering=True,id=61df4fc4-1d98-4d7b-b74e-280c61eac6ee,network=Network(1cb85bc6-49f1-46be-91c8-d814b48c2a99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61df4fc4-1d') Apr 18 16:25:51 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-64ed6fdb-c332-4a79-88a0-0684cf7144a8 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Deleting instance files /opt/stack/data/nova/instances/776d1402-3e8a-407d-a20d-db46c1a21b23_del Apr 18 16:25:51 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-64ed6fdb-c332-4a79-88a0-0684cf7144a8 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Deletion of /opt/stack/data/nova/instances/776d1402-3e8a-407d-a20d-db46c1a21b23_del complete Apr 18 16:25:51 user nova-compute[70975]: INFO nova.compute.manager [None req-64ed6fdb-c332-4a79-88a0-0684cf7144a8 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Took 0.64 seconds to destroy the instance on the hypervisor. Apr 18 16:25:51 user nova-compute[70975]: DEBUG oslo.service.loopingcall [None req-64ed6fdb-c332-4a79-88a0-0684cf7144a8 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=70975) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 18 16:25:51 user nova-compute[70975]: DEBUG nova.compute.manager [-] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Deallocating network for instance {{(pid=70975) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 18 16:25:51 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] deallocate_for_instance() {{(pid=70975) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 18 16:25:51 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:25:51 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Updating instance_info_cache with network_info: [] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:25:51 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Took 0.43 seconds to deallocate network for instance. 
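
Editor's note: the network_info / VIF payloads dumped in these DEBUG lines are plain JSON-compatible structures, which makes them easy to mine when debugging unplug or event-ordering problems. The sketch below pulls out the fields that usually matter (tap device, MAC, fixed IPs, bound driver) from an abridged copy of the payload logged at 16:25:51 for instance 776d1402-3e8a-407d-a20d-db46c1a21b23; it is an illustration, not Nova code.

    # Illustrative helper for reading the VIF dicts that appear in these
    # DEBUG lines; the sample is abridged from the payload logged above.
    import json

    vif_json = '''
    {"id": "61df4fc4-1d98-4d7b-b74e-280c61eac6ee",
     "address": "fa:16:3e:87:40:bf",
     "devname": "tap61df4fc4-1d",
     "details": {"bound_drivers": {"0": "ovn"}},
     "network": {"bridge": "br-int",
                 "subnets": [{"cidr": "10.0.0.0/28",
                              "ips": [{"address": "10.0.0.14", "type": "fixed"}]}]}}
    '''

    vif = json.loads(vif_json)
    fixed_ips = [ip["address"]
                 for subnet in vif["network"]["subnets"]
                 for ip in subnet["ips"]
                 if ip["type"] == "fixed"]
    print(vif["devname"], vif["address"], fixed_ips,
          vif["details"]["bound_drivers"]["0"])
    # -> tap61df4fc4-1d fa:16:3e:87:40:bf ['10.0.0.14'] ovn
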
Apr 18 16:25:51 user nova-compute[70975]: DEBUG nova.compute.manager [req-08aabf67-5813-40f2-9d17-9e08731fa7bc req-75258900-5e32-480b-870d-a02c56642eea service nova] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Received event network-vif-deleted-61df4fc4-1d98-4d7b-b74e-280c61eac6ee {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:25:51 user nova-compute[70975]: INFO nova.compute.manager [req-08aabf67-5813-40f2-9d17-9e08731fa7bc req-75258900-5e32-480b-870d-a02c56642eea service nova] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Neutron deleted interface 61df4fc4-1d98-4d7b-b74e-280c61eac6ee; detaching it from the instance and deleting it from the info cache Apr 18 16:25:51 user nova-compute[70975]: DEBUG nova.network.neutron [req-08aabf67-5813-40f2-9d17-9e08731fa7bc req-75258900-5e32-480b-870d-a02c56642eea service nova] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Updating instance_info_cache with network_info: [] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:25:51 user nova-compute[70975]: DEBUG nova.compute.manager [req-08aabf67-5813-40f2-9d17-9e08731fa7bc req-75258900-5e32-480b-870d-a02c56642eea service nova] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Detach interface failed, port_id=61df4fc4-1d98-4d7b-b74e-280c61eac6ee, reason: Instance 776d1402-3e8a-407d-a20d-db46c1a21b23 could not be found. {{(pid=70975) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 18 16:25:51 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-64ed6fdb-c332-4a79-88a0-0684cf7144a8 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:25:51 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-64ed6fdb-c332-4a79-88a0-0684cf7144a8 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:25:52 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-64ed6fdb-c332-4a79-88a0-0684cf7144a8 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:25:52 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-64ed6fdb-c332-4a79-88a0-0684cf7144a8 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:25:52 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-64ed6fdb-c332-4a79-88a0-0684cf7144a8 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.150s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:25:52 user nova-compute[70975]: INFO nova.scheduler.client.report [None req-64ed6fdb-c332-4a79-88a0-0684cf7144a8 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Deleted allocations for instance 776d1402-3e8a-407d-a20d-db46c1a21b23 Apr 18 16:25:52 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-64ed6fdb-c332-4a79-88a0-0684cf7144a8 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lock "776d1402-3e8a-407d-a20d-db46c1a21b23" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.417s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:25:53 user nova-compute[70975]: DEBUG nova.compute.manager [req-d3dd5b4c-084e-43f3-a6c6-7acf5c12d433 req-eed0e65a-b9be-47aa-9aa5-4dbde6cfa381 service nova] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Received event network-vif-plugged-61df4fc4-1d98-4d7b-b74e-280c61eac6ee {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:25:53 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-d3dd5b4c-084e-43f3-a6c6-7acf5c12d433 req-eed0e65a-b9be-47aa-9aa5-4dbde6cfa381 service nova] Acquiring lock "776d1402-3e8a-407d-a20d-db46c1a21b23-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:25:53 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-d3dd5b4c-084e-43f3-a6c6-7acf5c12d433 req-eed0e65a-b9be-47aa-9aa5-4dbde6cfa381 service nova] Lock "776d1402-3e8a-407d-a20d-db46c1a21b23-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:25:53 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-d3dd5b4c-084e-43f3-a6c6-7acf5c12d433 req-eed0e65a-b9be-47aa-9aa5-4dbde6cfa381 service nova] Lock "776d1402-3e8a-407d-a20d-db46c1a21b23-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:25:53 user nova-compute[70975]: DEBUG nova.compute.manager [req-d3dd5b4c-084e-43f3-a6c6-7acf5c12d433 req-eed0e65a-b9be-47aa-9aa5-4dbde6cfa381 service nova] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] No waiting events found dispatching network-vif-plugged-61df4fc4-1d98-4d7b-b74e-280c61eac6ee {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:25:53 user nova-compute[70975]: WARNING nova.compute.manager [req-d3dd5b4c-084e-43f3-a6c6-7acf5c12d433 req-eed0e65a-b9be-47aa-9aa5-4dbde6cfa381 service nova] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Received unexpected event network-vif-plugged-61df4fc4-1d98-4d7b-b74e-280c61eac6ee 
for instance with vm_state deleted and task_state None. Apr 18 16:25:54 user nova-compute[70975]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:25:54 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] VM Stopped (Lifecycle Event) Apr 18 16:25:54 user nova-compute[70975]: DEBUG nova.compute.manager [None req-472adcfc-166f-447d-8988-b507b522068e None None] [instance: 6aece7dd-d545-4e26-9cb7-30ee0b01ebb2] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:25:56 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:26:01 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:26:02 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:26:04 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:26:04 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:26:04 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=70975) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 18 16:26:05 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:26:06 user nova-compute[70975]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:26:06 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] VM Stopped (Lifecycle Event) Apr 18 16:26:06 user nova-compute[70975]: DEBUG nova.compute.manager [None req-a4b0355c-6bb7-4287-ab87-c9db51971b6a None None] [instance: 776d1402-3e8a-407d-a20d-db46c1a21b23] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:26:06 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:26:07 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:26:07 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager.update_available_resource {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:26:07 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:26:07 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:26:07 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:26:07 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Auditing locally available compute resources for user (node: user) {{(pid=70975) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 18 16:26:07 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk --force-share --output=json {{(pid=70975) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:26:07 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:26:07 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:26:07 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk --force-share --output=json" returned: 0 in 0.158s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:26:07 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:26:07 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:26:07 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:26:08 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:26:08 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] This host 
appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:26:08 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:26:08 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Hypervisor/Node resource view: name=user free_ram=8936MB free_disk=26.520671844482422GB free_vcpus=10 pci_devices=[{"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70975) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 18 16:26:08 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:26:08 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:26:08 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 6528f05a-9f05-4f35-b991-687e4f47029e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:26:08 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 66df9389-d007-4737-8bb1-55bcb5f227ff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:26:08 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Total usable vcpus: 12, total allocated vcpus: 2 {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 18 16:26:08 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Final resource view: name=user phys_ram=16023MB used_ram=768MB phys_disk=40GB used_disk=2GB total_vcpus=12 used_vcpus=2 pci_stats=[] {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 18 16:26:08 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:26:08 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:26:08 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Compute_service record updated for user:user {{(pid=70975) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 18 16:26:08 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.216s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:26:09 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:26:09 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Starting heal instance info cache {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 18 16:26:09 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Didn't find any instances for network info cache update. 
{{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 18 16:26:09 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:26:11 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:26:11 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:26:16 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:26:16 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:26:21 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:26:26 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:26:30 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-a8590daf-08a5-43ba-affc-5031b86feab3 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Acquiring lock "6528f05a-9f05-4f35-b991-687e4f47029e" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:26:30 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-a8590daf-08a5-43ba-affc-5031b86feab3 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "6528f05a-9f05-4f35-b991-687e4f47029e" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:26:30 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-a8590daf-08a5-43ba-affc-5031b86feab3 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Acquiring lock "6528f05a-9f05-4f35-b991-687e4f47029e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:26:30 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-a8590daf-08a5-43ba-affc-5031b86feab3 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "6528f05a-9f05-4f35-b991-687e4f47029e-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:26:30 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-a8590daf-08a5-43ba-affc-5031b86feab3 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "6528f05a-9f05-4f35-b991-687e4f47029e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:26:30 user nova-compute[70975]: INFO nova.compute.manager [None req-a8590daf-08a5-43ba-affc-5031b86feab3 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Terminating instance Apr 18 16:26:30 user nova-compute[70975]: DEBUG nova.compute.manager [None req-a8590daf-08a5-43ba-affc-5031b86feab3 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Start destroying the instance on the hypervisor. {{(pid=70975) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 18 16:26:30 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:26:30 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:26:30 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:26:30 user nova-compute[70975]: DEBUG nova.compute.manager [req-6561f61d-c823-44cf-9946-a6ec7054f8e5 req-3f5e87c5-01ac-44e3-bed7-82197f0d6e6e service nova] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Received event network-vif-unplugged-08164ae1-ace4-4d80-ad79-1741eacfa16e {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:26:30 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-6561f61d-c823-44cf-9946-a6ec7054f8e5 req-3f5e87c5-01ac-44e3-bed7-82197f0d6e6e service nova] Acquiring lock "6528f05a-9f05-4f35-b991-687e4f47029e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:26:30 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-6561f61d-c823-44cf-9946-a6ec7054f8e5 req-3f5e87c5-01ac-44e3-bed7-82197f0d6e6e service nova] Lock "6528f05a-9f05-4f35-b991-687e4f47029e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:26:30 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-6561f61d-c823-44cf-9946-a6ec7054f8e5 req-3f5e87c5-01ac-44e3-bed7-82197f0d6e6e service nova] Lock "6528f05a-9f05-4f35-b991-687e4f47029e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:26:30 user nova-compute[70975]: DEBUG nova.compute.manager [req-6561f61d-c823-44cf-9946-a6ec7054f8e5 req-3f5e87c5-01ac-44e3-bed7-82197f0d6e6e service nova] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] No waiting events found dispatching network-vif-unplugged-08164ae1-ace4-4d80-ad79-1741eacfa16e {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:26:30 user nova-compute[70975]: DEBUG nova.compute.manager [req-6561f61d-c823-44cf-9946-a6ec7054f8e5 req-3f5e87c5-01ac-44e3-bed7-82197f0d6e6e service nova] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Received event network-vif-unplugged-08164ae1-ace4-4d80-ad79-1741eacfa16e for instance with task_state deleting. {{(pid=70975) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 18 16:26:30 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Instance destroyed successfully. Apr 18 16:26:30 user nova-compute[70975]: DEBUG nova.objects.instance [None req-a8590daf-08a5-43ba-affc-5031b86feab3 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lazy-loading 'resources' on Instance uuid 6528f05a-9f05-4f35-b991-687e4f47029e {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:26:30 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-a8590daf-08a5-43ba-affc-5031b86feab3 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:17:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1865674245',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1865674245',id=14,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-18T16:18:06Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='f516f5ec45ca4508841c77f79e8c038b',ramdisk_id='',reservation_id='r-ad3dbxxv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerBootFromVolumeStableRescueT
est-2021464272',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-18T16:19:56Z,user_data=None,user_id='c54c277689214bd0a2cadb1e2ac288a9',uuid=6528f05a-9f05-4f35-b991-687e4f47029e,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "08164ae1-ace4-4d80-ad79-1741eacfa16e", "address": "fa:16:3e:28:00:5b", "network": {"id": "923d10dc-c67e-4426-9c6e-856e903e2446", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-131274818-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f516f5ec45ca4508841c77f79e8c038b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap08164ae1-ac", "ovs_interfaceid": "08164ae1-ace4-4d80-ad79-1741eacfa16e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 18 16:26:30 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-a8590daf-08a5-43ba-affc-5031b86feab3 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Converting VIF {"id": "08164ae1-ace4-4d80-ad79-1741eacfa16e", "address": "fa:16:3e:28:00:5b", "network": {"id": "923d10dc-c67e-4426-9c6e-856e903e2446", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-131274818-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f516f5ec45ca4508841c77f79e8c038b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap08164ae1-ac", "ovs_interfaceid": "08164ae1-ace4-4d80-ad79-1741eacfa16e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:26:30 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-a8590daf-08a5-43ba-affc-5031b86feab3 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:28:00:5b,bridge_name='br-int',has_traffic_filtering=True,id=08164ae1-ace4-4d80-ad79-1741eacfa16e,network=Network(923d10dc-c67e-4426-9c6e-856e903e2446),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap08164ae1-ac') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:26:30 user nova-compute[70975]: DEBUG os_vif [None req-a8590daf-08a5-43ba-affc-5031b86feab3 
tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:28:00:5b,bridge_name='br-int',has_traffic_filtering=True,id=08164ae1-ace4-4d80-ad79-1741eacfa16e,network=Network(923d10dc-c67e-4426-9c6e-856e903e2446),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap08164ae1-ac') {{(pid=70975) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 18 16:26:30 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:26:30 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08164ae1-ac, bridge=br-int, if_exists=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:26:30 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:26:30 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:26:30 user nova-compute[70975]: INFO os_vif [None req-a8590daf-08a5-43ba-affc-5031b86feab3 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:28:00:5b,bridge_name='br-int',has_traffic_filtering=True,id=08164ae1-ace4-4d80-ad79-1741eacfa16e,network=Network(923d10dc-c67e-4426-9c6e-856e903e2446),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap08164ae1-ac') Apr 18 16:26:30 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-a8590daf-08a5-43ba-affc-5031b86feab3 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Deleting instance files /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e_del Apr 18 16:26:30 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-a8590daf-08a5-43ba-affc-5031b86feab3 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Deletion of /opt/stack/data/nova/instances/6528f05a-9f05-4f35-b991-687e4f47029e_del complete Apr 18 16:26:30 user nova-compute[70975]: INFO nova.compute.manager [None req-a8590daf-08a5-43ba-affc-5031b86feab3 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Took 0.65 seconds to destroy the instance on the hypervisor. Apr 18 16:26:30 user nova-compute[70975]: DEBUG oslo.service.loopingcall [None req-a8590daf-08a5-43ba-affc-5031b86feab3 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=70975) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 18 16:26:30 user nova-compute[70975]: DEBUG nova.compute.manager [-] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Deallocating network for instance {{(pid=70975) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 18 16:26:30 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] deallocate_for_instance() {{(pid=70975) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 18 16:26:31 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:26:31 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Updating instance_info_cache with network_info: [] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:26:31 user nova-compute[70975]: DEBUG nova.compute.manager [req-8c8daf74-c039-4f73-be87-9a1be3e9d8cf req-f4a9d65b-34be-4e10-bcff-80553a91b7e1 service nova] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Received event network-vif-deleted-08164ae1-ace4-4d80-ad79-1741eacfa16e {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:26:31 user nova-compute[70975]: INFO nova.compute.manager [req-8c8daf74-c039-4f73-be87-9a1be3e9d8cf req-f4a9d65b-34be-4e10-bcff-80553a91b7e1 service nova] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Neutron deleted interface 08164ae1-ace4-4d80-ad79-1741eacfa16e; detaching it from the instance and deleting it from the info cache Apr 18 16:26:31 user nova-compute[70975]: DEBUG nova.network.neutron [req-8c8daf74-c039-4f73-be87-9a1be3e9d8cf req-f4a9d65b-34be-4e10-bcff-80553a91b7e1 service nova] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Updating instance_info_cache with network_info: [] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:26:31 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Took 0.62 seconds to deallocate network for instance. Apr 18 16:26:31 user nova-compute[70975]: DEBUG nova.compute.manager [req-8c8daf74-c039-4f73-be87-9a1be3e9d8cf req-f4a9d65b-34be-4e10-bcff-80553a91b7e1 service nova] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Detach interface failed, port_id=08164ae1-ace4-4d80-ad79-1741eacfa16e, reason: Instance 6528f05a-9f05-4f35-b991-687e4f47029e could not be found. 
{{(pid=70975) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 18 16:26:31 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-a8590daf-08a5-43ba-affc-5031b86feab3 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:26:31 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-a8590daf-08a5-43ba-affc-5031b86feab3 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:26:31 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-a8590daf-08a5-43ba-affc-5031b86feab3 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:26:31 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-a8590daf-08a5-43ba-affc-5031b86feab3 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:26:31 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-a8590daf-08a5-43ba-affc-5031b86feab3 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.134s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:26:31 user nova-compute[70975]: INFO nova.scheduler.client.report [None req-a8590daf-08a5-43ba-affc-5031b86feab3 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Deleted allocations for instance 6528f05a-9f05-4f35-b991-687e4f47029e Apr 18 16:26:31 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-a8590daf-08a5-43ba-affc-5031b86feab3 tempest-ServerBootFromVolumeStableRescueTest-2021464272 tempest-ServerBootFromVolumeStableRescueTest-2021464272-project-member] Lock "6528f05a-9f05-4f35-b991-687e4f47029e" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.582s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:26:32 user nova-compute[70975]: DEBUG 
nova.compute.manager [req-58c294dd-f772-4fef-b180-50d28c08c7ce req-b467f45d-24fb-47a2-8af0-beb8e67136e5 service nova] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Received event network-vif-plugged-08164ae1-ace4-4d80-ad79-1741eacfa16e {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:26:32 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-58c294dd-f772-4fef-b180-50d28c08c7ce req-b467f45d-24fb-47a2-8af0-beb8e67136e5 service nova] Acquiring lock "6528f05a-9f05-4f35-b991-687e4f47029e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:26:32 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-58c294dd-f772-4fef-b180-50d28c08c7ce req-b467f45d-24fb-47a2-8af0-beb8e67136e5 service nova] Lock "6528f05a-9f05-4f35-b991-687e4f47029e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:26:32 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-58c294dd-f772-4fef-b180-50d28c08c7ce req-b467f45d-24fb-47a2-8af0-beb8e67136e5 service nova] Lock "6528f05a-9f05-4f35-b991-687e4f47029e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:26:32 user nova-compute[70975]: DEBUG nova.compute.manager [req-58c294dd-f772-4fef-b180-50d28c08c7ce req-b467f45d-24fb-47a2-8af0-beb8e67136e5 service nova] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] No waiting events found dispatching network-vif-plugged-08164ae1-ace4-4d80-ad79-1741eacfa16e {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:26:32 user nova-compute[70975]: WARNING nova.compute.manager [req-58c294dd-f772-4fef-b180-50d28c08c7ce req-b467f45d-24fb-47a2-8af0-beb8e67136e5 service nova] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Received unexpected event network-vif-plugged-08164ae1-ace4-4d80-ad79-1741eacfa16e for instance with vm_state deleted and task_state None. 
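The records above repeatedly show two oslo.concurrency patterns: per-instance "-events" and "compute_resources" locks being acquired and released with wait/hold timings (lockutils.py:404/409/423), and qemu-img being run under a prlimit wrapper while the resource tracker audits local disks. The snippet below is a minimal sketch of how such DEBUG lines are typically produced, assuming only that the oslo.concurrency library is installed; the helper names (handle_instance_event, get_disk_info) are illustrative and are not Nova's own.

    # Minimal sketch (not Nova's actual code) of the two oslo.concurrency
    # patterns visible in the records above.
    from oslo_concurrency import lockutils, processutils

    # The 'Acquiring lock ... by ...' / 'acquired ... :: waited' /
    # '"released" ... :: held' DEBUG lines come from the lockutils.synchronized
    # decorator, which wraps the callable in a named in-process lock and logs
    # wait/hold times from its inner wrapper.
    @lockutils.synchronized('776d1402-3e8a-407d-a20d-db46c1a21b23-events')
    def handle_instance_event():
        # event bookkeeping would happen here while the per-instance
        # "-events" lock is held
        pass

    # The 'Running cmd (subprocess): /usr/bin/python3.10 -m
    # oslo_concurrency.prlimit ... qemu-img info' lines correspond to
    # processutils.execute() called with a prlimit, which re-executes the
    # command under 'python -m oslo_concurrency.prlimit' to cap its address
    # space and CPU time.
    QEMU_IMG_LIMITS = processutils.ProcessLimits(
        cpu_time=30,                     # matches --cpu=30 in the logged command
        address_space=1 * 1024 ** 3)     # matches --as=1073741824 (1 GiB)

    def get_disk_info(path):
        out, _err = processutils.execute(
            'env', 'LC_ALL=C', 'LANG=C',
            'qemu-img', 'info', path, '--force-share', '--output=json',
            prlimit=QEMU_IMG_LIMITS)
        return out

    if __name__ == '__main__':
        # Acquires and releases the named lock; with oslo logging configured
        # at DEBUG this emits the same acquire/release lines as above.
        handle_instance_event()

With oslo.log configured at DEBUG level, running the sketch reproduces the same acquire/wait/release bookkeeping seen throughout this trace; the qemu-img helper additionally requires qemu-img on PATH and a readable disk image.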
Apr 18 16:26:35 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:26:40 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:26:40 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:26:40 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=70975) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 18 16:26:40 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=70975) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 18 16:26:40 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70975) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 18 16:26:40 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:26:44 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Acquiring lock "2e71dfcf-8ffe-47de-9922-ad5a01615168" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:26:44 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lock "2e71dfcf-8ffe-47de-9922-ad5a01615168" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:26:44 user nova-compute[70975]: DEBUG nova.compute.manager [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Starting instance... 
{{(pid=70975) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 18 16:26:44 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:26:44 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:26:44 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=70975) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 18 16:26:44 user nova-compute[70975]: INFO nova.compute.claims [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Claim successful on node user Apr 18 16:26:44 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:26:44 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:26:44 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.225s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:26:44 user nova-compute[70975]: DEBUG nova.compute.manager [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Start building networks asynchronously for instance. 
{{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 18 16:26:45 user nova-compute[70975]: DEBUG nova.compute.manager [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Allocating IP information in the background. {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 18 16:26:45 user nova-compute[70975]: DEBUG nova.network.neutron [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] allocate_for_instance() {{(pid=70975) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 18 16:26:45 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 18 16:26:45 user nova-compute[70975]: DEBUG nova.compute.manager [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Start building block device mappings for instance. {{(pid=70975) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 18 16:26:45 user nova-compute[70975]: DEBUG nova.policy [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'add4c9d906ba49a590f203e0aa98ab64', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '00d4b993c13b46ea8f80b0caed60a373', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70975) authorize /opt/stack/nova/nova/policy.py:203}} Apr 18 16:26:45 user nova-compute[70975]: DEBUG nova.compute.manager [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Start spawning the instance on the hypervisor. 
{{(pid=70975) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 18 16:26:45 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Creating instance directory {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 18 16:26:45 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Creating image(s) Apr 18 16:26:45 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Acquiring lock "/opt/stack/data/nova/instances/2e71dfcf-8ffe-47de-9922-ad5a01615168/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:26:45 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lock "/opt/stack/data/nova/instances/2e71dfcf-8ffe-47de-9922-ad5a01615168/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:26:45 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lock "/opt/stack/data/nova/instances/2e71dfcf-8ffe-47de-9922-ad5a01615168/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:26:45 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Acquiring lock "5cddd989d931fd16c7948af549631d69806aef6d" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:26:45 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lock "5cddd989d931fd16c7948af549631d69806aef6d" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:26:46 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img 
info /opt/stack/data/nova/instances/_base/5cddd989d931fd16c7948af549631d69806aef6d.part --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:26:46 user nova-compute[70975]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:26:46 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] VM Stopped (Lifecycle Event) Apr 18 16:26:46 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:26:46 user nova-compute[70975]: DEBUG nova.compute.manager [None req-d358a92e-c40e-4a51-a174-e950dff04f9b None None] [instance: 6528f05a-9f05-4f35-b991-687e4f47029e] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:26:46 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/5cddd989d931fd16c7948af549631d69806aef6d.part --force-share --output=json" returned: 0 in 0.145s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:26:46 user nova-compute[70975]: DEBUG nova.virt.images [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] ec984284-e2e2-460c-8d57-fd257eb433e3 was qcow2, converting to raw {{(pid=70975) fetch_to_raw /opt/stack/nova/nova/virt/images.py:165}} Apr 18 16:26:46 user nova-compute[70975]: DEBUG nova.privsep.utils [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=70975) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 18 16:26:46 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/5cddd989d931fd16c7948af549631d69806aef6d.part /opt/stack/data/nova/instances/_base/5cddd989d931fd16c7948af549631d69806aef6d.converted {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:26:46 user nova-compute[70975]: DEBUG nova.network.neutron [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Successfully created port: a625f474-3393-4722-8b87-e0fcff38705b {{(pid=70975) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 18 16:26:46 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 
/opt/stack/data/nova/instances/_base/5cddd989d931fd16c7948af549631d69806aef6d.part /opt/stack/data/nova/instances/_base/5cddd989d931fd16c7948af549631d69806aef6d.converted" returned: 0 in 0.341s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:26:46 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/5cddd989d931fd16c7948af549631d69806aef6d.converted --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:26:46 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/5cddd989d931fd16c7948af549631d69806aef6d.converted --force-share --output=json" returned: 0 in 0.132s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:26:46 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lock "5cddd989d931fd16c7948af549631d69806aef6d" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 1.617s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:26:46 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/5cddd989d931fd16c7948af549631d69806aef6d --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:26:46 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/5cddd989d931fd16c7948af549631d69806aef6d --force-share --output=json" returned: 0 in 0.136s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:26:46 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Acquiring lock "5cddd989d931fd16c7948af549631d69806aef6d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:26:46 user nova-compute[70975]: 
DEBUG oslo_concurrency.lockutils [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lock "5cddd989d931fd16c7948af549631d69806aef6d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:26:46 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/5cddd989d931fd16c7948af549631d69806aef6d --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/5cddd989d931fd16c7948af549631d69806aef6d --force-share --output=json" returned: 0 in 0.137s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/5cddd989d931fd16c7948af549631d69806aef6d,backing_fmt=raw /opt/stack/data/nova/instances/2e71dfcf-8ffe-47de-9922-ad5a01615168/disk 1073741824 {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG nova.network.neutron [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Successfully updated port: a625f474-3393-4722-8b87-e0fcff38705b {{(pid=70975) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/5cddd989d931fd16c7948af549631d69806aef6d,backing_fmt=raw /opt/stack/data/nova/instances/2e71dfcf-8ffe-47de-9922-ad5a01615168/disk 1073741824" returned: 0 in 0.051s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lock "5cddd989d931fd16c7948af549631d69806aef6d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.194s {{(pid=70975) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/5cddd989d931fd16c7948af549631d69806aef6d --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Acquiring lock "refresh_cache-2e71dfcf-8ffe-47de-9922-ad5a01615168" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Acquired lock "refresh_cache-2e71dfcf-8ffe-47de-9922-ad5a01615168" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG nova.network.neutron [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Building network info cache for instance {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG nova.compute.manager [req-ee1e9453-7ae5-4c22-aef8-dd1e7b2e7b74 req-1d28bdbd-157d-4772-8e24-34bf76b33989 service nova] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Received event network-changed-a625f474-3393-4722-8b87-e0fcff38705b {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG nova.compute.manager [req-ee1e9453-7ae5-4c22-aef8-dd1e7b2e7b74 req-1d28bdbd-157d-4772-8e24-34bf76b33989 service nova] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Refreshing instance network info cache due to event network-changed-a625f474-3393-4722-8b87-e0fcff38705b. {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-ee1e9453-7ae5-4c22-aef8-dd1e7b2e7b74 req-1d28bdbd-157d-4772-8e24-34bf76b33989 service nova] Acquiring lock "refresh_cache-2e71dfcf-8ffe-47de-9922-ad5a01615168" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG nova.network.neutron [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Instance cache missing network info. 
{{(pid=70975) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/5cddd989d931fd16c7948af549631d69806aef6d --force-share --output=json" returned: 0 in 0.136s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Checking if we can resize image /opt/stack/data/nova/instances/2e71dfcf-8ffe-47de-9922-ad5a01615168/disk. size=1073741824 {{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2e71dfcf-8ffe-47de-9922-ad5a01615168/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2e71dfcf-8ffe-47de-9922-ad5a01615168/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG nova.virt.disk.api [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Cannot resize image /opt/stack/data/nova/instances/2e71dfcf-8ffe-47de-9922-ad5a01615168/disk to a smaller size. 
{{(pid=70975) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG nova.objects.instance [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lazy-loading 'migration_context' on Instance uuid 2e71dfcf-8ffe-47de-9922-ad5a01615168 {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Created local disks {{(pid=70975) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Ensure instance console log exists: /opt/stack/data/nova/instances/2e71dfcf-8ffe-47de-9922-ad5a01615168/console.log {{(pid=70975) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG nova.network.neutron [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Updating instance_info_cache with network_info: [{"id": "a625f474-3393-4722-8b87-e0fcff38705b", "address": "fa:16:3e:35:85:e4", "network": {"id": "1cb85bc6-49f1-46be-91c8-d814b48c2a99", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1818136803-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "00d4b993c13b46ea8f80b0caed60a373", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": 
"ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa625f474-33", "ovs_interfaceid": "a625f474-3393-4722-8b87-e0fcff38705b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Releasing lock "refresh_cache-2e71dfcf-8ffe-47de-9922-ad5a01615168" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG nova.compute.manager [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Instance network_info: |[{"id": "a625f474-3393-4722-8b87-e0fcff38705b", "address": "fa:16:3e:35:85:e4", "network": {"id": "1cb85bc6-49f1-46be-91c8-d814b48c2a99", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1818136803-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "00d4b993c13b46ea8f80b0caed60a373", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa625f474-33", "ovs_interfaceid": "a625f474-3393-4722-8b87-e0fcff38705b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70975) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-ee1e9453-7ae5-4c22-aef8-dd1e7b2e7b74 req-1d28bdbd-157d-4772-8e24-34bf76b33989 service nova] Acquired lock "refresh_cache-2e71dfcf-8ffe-47de-9922-ad5a01615168" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG nova.network.neutron [req-ee1e9453-7ae5-4c22-aef8-dd1e7b2e7b74 req-1d28bdbd-157d-4772-8e24-34bf76b33989 service nova] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Refreshing network info cache for port a625f474-3393-4722-8b87-e0fcff38705b {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Start _get_guest_xml network_info=[{"id": "a625f474-3393-4722-8b87-e0fcff38705b", "address": "fa:16:3e:35:85:e4", "network": {"id": "1cb85bc6-49f1-46be-91c8-d814b48c2a99", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1818136803-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, 
"meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "00d4b993c13b46ea8f80b0caed60a373", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa625f474-33", "ovs_interfaceid": "a625f474-3393-4722-8b87-e0fcff38705b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:26:42Z,direct_url=,disk_format='qcow2',id=ec984284-e2e2-460c-8d57-fd257eb433e3,min_disk=0,min_ram=0,name='tempest-scenario-img--815805754',owner='00d4b993c13b46ea8f80b0caed60a373',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:26:43Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'size': 0, 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encrypted': False, 'device_type': 'disk', 'encryption_secret_uuid': None, 'encryption_format': None, 'disk_bus': 'virtio', 'image_id': 'ec984284-e2e2-460c-8d57-fd257eb433e3'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 18 16:26:47 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:26:47 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 18 16:26:47 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70975) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-18T16:11:27Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-18T16:26:42Z,direct_url=,disk_format='qcow2',id=ec984284-e2e2-460c-8d57-fd257eb433e3,min_disk=0,min_ram=0,name='tempest-scenario-img--815805754',owner='00d4b993c13b46ea8f80b0caed60a373',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-18T16:26:43Z,virtual_size=,visibility=), allow threads: True {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Flavor limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Image limits 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Flavor pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Image pref 0:0:0 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70975) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70975) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:564}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Got 1 possible topologies {{(pid=70975) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG nova.virt.hardware [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70975) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:26:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-645939165',display_name='tempest-TestMinimumBasicScenario-server-645939165',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-645939165',id=25,image_ref='ec984284-e2e2-460c-8d57-fd257eb433e3',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFTjzjFXAU6TCf5cUZ4HAW/uTn6Y3TUbR85mBRl2+FlNM0NVwWI69f9l4ZtVvR9kFoEORm5vcNFscXnnbdpx5zWg+f/sXB25kjR5e5QBGQL42u3J1dK+/RPVXZi6Tm2BzQ==',key_name='tempest-TestMinimumBasicScenario-1240941080',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='00d4b993c13b46ea8f80b0caed60a373',ramdisk_id='',reservation_id='r-dt6ukame',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ec984284-e2e2-460c-8d57-fd257eb433e3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-386776323',owner_user_name='tempest-TestMinimumBasicScenario-386776323-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:26:45Z,user_data=None,user_id='add4c9d906ba49a590f203e0aa98ab64',uuid=2e71dfcf-8ffe-47de-9922-ad5a01615168,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a625f474-3393-4722-8b87-e0fcff38705b", "address": "fa:16:3e:35:85:e4", "network": {"id": "1cb85bc6-49f1-46be-91c8-d814b48c2a99", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1818136803-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "00d4b993c13b46ea8f80b0caed60a373", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa625f474-33", "ovs_interfaceid": "a625f474-3393-4722-8b87-e0fcff38705b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70975) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Converting VIF {"id": "a625f474-3393-4722-8b87-e0fcff38705b", "address": "fa:16:3e:35:85:e4", "network": {"id": "1cb85bc6-49f1-46be-91c8-d814b48c2a99", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1818136803-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "00d4b993c13b46ea8f80b0caed60a373", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa625f474-33", "ovs_interfaceid": 
"a625f474-3393-4722-8b87-e0fcff38705b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:85:e4,bridge_name='br-int',has_traffic_filtering=True,id=a625f474-3393-4722-8b87-e0fcff38705b,network=Network(1cb85bc6-49f1-46be-91c8-d814b48c2a99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa625f474-33') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG nova.objects.instance [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lazy-loading 'pci_devices' on Instance uuid 2e71dfcf-8ffe-47de-9922-ad5a01615168 {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] End _get_guest_xml xml= Apr 18 16:26:47 user nova-compute[70975]: 2e71dfcf-8ffe-47de-9922-ad5a01615168 Apr 18 16:26:47 user nova-compute[70975]: instance-00000019 Apr 18 16:26:47 user nova-compute[70975]: 131072 Apr 18 16:26:47 user nova-compute[70975]: 1 Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: tempest-TestMinimumBasicScenario-server-645939165 Apr 18 16:26:47 user nova-compute[70975]: 2023-04-18 16:26:47 Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: 128 Apr 18 16:26:47 user nova-compute[70975]: 1 Apr 18 16:26:47 user nova-compute[70975]: 0 Apr 18 16:26:47 user nova-compute[70975]: 0 Apr 18 16:26:47 user nova-compute[70975]: 1 Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: tempest-TestMinimumBasicScenario-386776323-project-member Apr 18 16:26:47 user nova-compute[70975]: tempest-TestMinimumBasicScenario-386776323 Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: OpenStack Foundation Apr 18 16:26:47 user nova-compute[70975]: OpenStack Nova Apr 18 16:26:47 user nova-compute[70975]: 0.0.0 Apr 18 16:26:47 user nova-compute[70975]: 2e71dfcf-8ffe-47de-9922-ad5a01615168 Apr 18 16:26:47 user nova-compute[70975]: 2e71dfcf-8ffe-47de-9922-ad5a01615168 Apr 18 16:26:47 user nova-compute[70975]: Virtual Machine Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 
18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: hvm Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Nehalem Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: /dev/urandom Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: Apr 18 16:26:47 user nova-compute[70975]: {{(pid=70975) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:26:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-645939165',display_name='tempest-TestMinimumBasicScenario-server-645939165',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-645939165',id=25,image_ref='ec984284-e2e2-460c-8d57-fd257eb433e3',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFTjzjFXAU6TCf5cUZ4HAW/uTn6Y3TUbR85mBRl2+FlNM0NVwWI69f9l4ZtVvR9kFoEORm5vcNFscXnnbdpx5zWg+f/sXB25kjR5e5QBGQL42u3J1dK+/RPVXZi6Tm2BzQ==',key_name='tempest-TestMinimumBasicScenario-1240941080',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='00d4b993c13b46ea8f80b0caed60a373',ramdisk_id='',reservation_id='r-dt6ukame',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ec984284-e2e2-460c-8d57-fd257eb433e3',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-386776323',owner_user_name='tempest-TestMinimumBasicScenario-386776323-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-18T16:26:45Z,user_data=None,user_id='add4c9d906ba49a590f203e0aa98ab64',uuid=2e71dfcf-8ffe-47de-9922-ad5a01615168,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a625f474-3393-4722-8b87-e0fcff38705b", "address": "fa:16:3e:35:85:e4", "network": {"id": "1cb85bc6-49f1-46be-91c8-d814b48c2a99", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1818136803-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "00d4b993c13b46ea8f80b0caed60a373", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa625f474-33", "ovs_interfaceid": "a625f474-3393-4722-8b87-e0fcff38705b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Converting VIF {"id": "a625f474-3393-4722-8b87-e0fcff38705b", "address": "fa:16:3e:35:85:e4", "network": {"id": "1cb85bc6-49f1-46be-91c8-d814b48c2a99", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1818136803-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "00d4b993c13b46ea8f80b0caed60a373", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa625f474-33", "ovs_interfaceid": 
"a625f474-3393-4722-8b87-e0fcff38705b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:85:e4,bridge_name='br-int',has_traffic_filtering=True,id=a625f474-3393-4722-8b87-e0fcff38705b,network=Network(1cb85bc6-49f1-46be-91c8-d814b48c2a99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa625f474-33') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG os_vif [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:85:e4,bridge_name='br-int',has_traffic_filtering=True,id=a625f474-3393-4722-8b87-e0fcff38705b,network=Network(1cb85bc6-49f1-46be-91c8-d814b48c2a99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa625f474-33') {{(pid=70975) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa625f474-33, may_exist=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa625f474-33, col_values=(('external_ids', {'iface-id': 'a625f474-3393-4722-8b87-e0fcff38705b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:35:85:e4', 'vm-uuid': '2e71dfcf-8ffe-47de-9922-ad5a01615168'}),)) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:26:47 user nova-compute[70975]: INFO os_vif [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:85:e4,bridge_name='br-int',has_traffic_filtering=True,id=a625f474-3393-4722-8b87-e0fcff38705b,network=Network(1cb85bc6-49f1-46be-91c8-d814b48c2a99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa625f474-33') Apr 18 16:26:47 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] No BDM found with device name vda, not building metadata. {{(pid=70975) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 18 16:26:47 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] No VIF found with MAC fa:16:3e:35:85:e4, not building metadata {{(pid=70975) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 18 16:26:48 user nova-compute[70975]: DEBUG nova.network.neutron [req-ee1e9453-7ae5-4c22-aef8-dd1e7b2e7b74 req-1d28bdbd-157d-4772-8e24-34bf76b33989 service nova] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Updated VIF entry in instance network info cache for port a625f474-3393-4722-8b87-e0fcff38705b. 
{{(pid=70975) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 18 16:26:48 user nova-compute[70975]: DEBUG nova.network.neutron [req-ee1e9453-7ae5-4c22-aef8-dd1e7b2e7b74 req-1d28bdbd-157d-4772-8e24-34bf76b33989 service nova] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Updating instance_info_cache with network_info: [{"id": "a625f474-3393-4722-8b87-e0fcff38705b", "address": "fa:16:3e:35:85:e4", "network": {"id": "1cb85bc6-49f1-46be-91c8-d814b48c2a99", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1818136803-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "00d4b993c13b46ea8f80b0caed60a373", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa625f474-33", "ovs_interfaceid": "a625f474-3393-4722-8b87-e0fcff38705b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:26:48 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-ee1e9453-7ae5-4c22-aef8-dd1e7b2e7b74 req-1d28bdbd-157d-4772-8e24-34bf76b33989 service nova] Releasing lock "refresh_cache-2e71dfcf-8ffe-47de-9922-ad5a01615168" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:26:48 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:26:49 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:26:49 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:26:49 user nova-compute[70975]: DEBUG nova.compute.manager [req-5e513f5d-94fd-42b6-8991-e12eafd42266 req-fb4991f8-2b03-4a2b-874d-e846c0f1f91b service nova] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Received event network-vif-plugged-a625f474-3393-4722-8b87-e0fcff38705b {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:26:49 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-5e513f5d-94fd-42b6-8991-e12eafd42266 req-fb4991f8-2b03-4a2b-874d-e846c0f1f91b service nova] Acquiring lock "2e71dfcf-8ffe-47de-9922-ad5a01615168-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:26:49 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-5e513f5d-94fd-42b6-8991-e12eafd42266 req-fb4991f8-2b03-4a2b-874d-e846c0f1f91b service nova] Lock "2e71dfcf-8ffe-47de-9922-ad5a01615168-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:26:49 user 
nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-5e513f5d-94fd-42b6-8991-e12eafd42266 req-fb4991f8-2b03-4a2b-874d-e846c0f1f91b service nova] Lock "2e71dfcf-8ffe-47de-9922-ad5a01615168-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:26:49 user nova-compute[70975]: DEBUG nova.compute.manager [req-5e513f5d-94fd-42b6-8991-e12eafd42266 req-fb4991f8-2b03-4a2b-874d-e846c0f1f91b service nova] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] No waiting events found dispatching network-vif-plugged-a625f474-3393-4722-8b87-e0fcff38705b {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:26:49 user nova-compute[70975]: WARNING nova.compute.manager [req-5e513f5d-94fd-42b6-8991-e12eafd42266 req-fb4991f8-2b03-4a2b-874d-e846c0f1f91b service nova] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Received unexpected event network-vif-plugged-a625f474-3393-4722-8b87-e0fcff38705b for instance with vm_state building and task_state spawning. Apr 18 16:26:49 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:26:49 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:26:49 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:26:51 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Resumed> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:26:51 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] VM Resumed (Lifecycle Event) Apr 18 16:26:51 user nova-compute[70975]: DEBUG nova.compute.manager [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Instance event wait completed in 0 seconds for {{(pid=70975) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 18 16:26:51 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Guest created on hypervisor {{(pid=70975) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 18 16:26:51 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Instance spawned successfully. 
Apr 18 16:26:51 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 18 16:26:51 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:26:51 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:26:51 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Found default for hw_cdrom_bus of ide {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:26:51 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Found default for hw_disk_bus of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:26:51 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Found default for hw_input_bus of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:26:51 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Found default for hw_pointer_model of None {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:26:51 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Found default for hw_video_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:26:51 user nova-compute[70975]: DEBUG nova.virt.libvirt.driver [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 
2e71dfcf-8ffe-47de-9922-ad5a01615168] Found default for hw_vif_model of virtio {{(pid=70975) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 18 16:26:51 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] During sync_power_state the instance has a pending task (spawning). Skip. Apr 18 16:26:51 user nova-compute[70975]: DEBUG nova.virt.driver [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] Emitting event Started> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:26:51 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] VM Started (Lifecycle Event) Apr 18 16:26:51 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:26:51 user nova-compute[70975]: DEBUG nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70975) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 18 16:26:51 user nova-compute[70975]: INFO nova.compute.manager [None req-f375217b-f4da-441a-bd7d-cb9ef0703933 None None] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] During sync_power_state the instance has a pending task (spawning). Skip. Apr 18 16:26:51 user nova-compute[70975]: INFO nova.compute.manager [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Took 5.99 seconds to spawn the instance on the hypervisor. Apr 18 16:26:51 user nova-compute[70975]: DEBUG nova.compute.manager [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:26:51 user nova-compute[70975]: INFO nova.compute.manager [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Took 6.54 seconds to build instance. 
Apr 18 16:26:51 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-afcec3f2-471b-4f08-a03e-8296da751340 tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lock "2e71dfcf-8ffe-47de-9922-ad5a01615168" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.621s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:26:51 user nova-compute[70975]: DEBUG nova.compute.manager [req-a3489670-e4e0-4a42-93e3-9724e5b9e83b req-8478acd9-36a9-4bca-8555-eae42ddfbd37 service nova] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Received event network-vif-plugged-a625f474-3393-4722-8b87-e0fcff38705b {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:26:51 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-a3489670-e4e0-4a42-93e3-9724e5b9e83b req-8478acd9-36a9-4bca-8555-eae42ddfbd37 service nova] Acquiring lock "2e71dfcf-8ffe-47de-9922-ad5a01615168-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:26:51 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-a3489670-e4e0-4a42-93e3-9724e5b9e83b req-8478acd9-36a9-4bca-8555-eae42ddfbd37 service nova] Lock "2e71dfcf-8ffe-47de-9922-ad5a01615168-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.002s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:26:51 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-a3489670-e4e0-4a42-93e3-9724e5b9e83b req-8478acd9-36a9-4bca-8555-eae42ddfbd37 service nova] Lock "2e71dfcf-8ffe-47de-9922-ad5a01615168-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:26:51 user nova-compute[70975]: DEBUG nova.compute.manager [req-a3489670-e4e0-4a42-93e3-9724e5b9e83b req-8478acd9-36a9-4bca-8555-eae42ddfbd37 service nova] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] No waiting events found dispatching network-vif-plugged-a625f474-3393-4722-8b87-e0fcff38705b {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:26:51 user nova-compute[70975]: WARNING nova.compute.manager [req-a3489670-e4e0-4a42-93e3-9724e5b9e83b req-8478acd9-36a9-4bca-8555-eae42ddfbd37 service nova] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Received unexpected event network-vif-plugged-a625f474-3393-4722-8b87-e0fcff38705b for instance with vm_state active and task_state None. 
Apr 18 16:26:52 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:26:52 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:26:57 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:27:02 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:27:02 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:27:02 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:27:02 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=70975) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 18 16:27:02 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=70975) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 18 16:27:02 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70975) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 18 16:27:02 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:27:04 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:27:04 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=70975) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 18 16:27:05 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:27:05 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:27:07 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:27:07 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:27:08 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:27:08 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Starting heal instance info cache {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 18 16:27:08 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Rebuilding the list of instances to heal {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 18 16:27:08 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "refresh_cache-66df9389-d007-4737-8bb1-55bcb5f227ff" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:27:08 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquired lock "refresh_cache-66df9389-d007-4737-8bb1-55bcb5f227ff" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:27:08 user nova-compute[70975]: DEBUG nova.network.neutron [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Forcefully refreshing network info cache for instance {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 18 16:27:08 user nova-compute[70975]: DEBUG nova.objects.instance [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lazy-loading 'info_cache' on Instance uuid 66df9389-d007-4737-8bb1-55bcb5f227ff {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:27:09 user nova-compute[70975]: DEBUG nova.network.neutron [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Updating instance_info_cache with network_info: [{"id": "b66d41ab-873c-4826-a3f8-d4f4276fff10", "address": "fa:16:3e:58:32:25", "network": {"id": "236fa8aa-433b-4dfa-a787-f165c3389489", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1486162327-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": 
[], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5695adbb14ea4162bc40547b1509a1e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb66d41ab-87", "ovs_interfaceid": "b66d41ab-873c-4826-a3f8-d4f4276fff10", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:27:09 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Releasing lock "refresh_cache-66df9389-d007-4737-8bb1-55bcb5f227ff" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:27:09 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Updated the network info_cache for instance {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 18 16:27:09 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:27:09 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager.update_available_resource {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:27:09 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:27:09 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:27:09 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:27:09 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Auditing locally available compute resources for user (node: user) {{(pid=70975) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 18 16:27:09 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): 
/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:27:09 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:27:09 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:27:09 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:27:09 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2e71dfcf-8ffe-47de-9922-ad5a01615168/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:27:09 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2e71dfcf-8ffe-47de-9922-ad5a01615168/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:27:09 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2e71dfcf-8ffe-47de-9922-ad5a01615168/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:27:10 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2e71dfcf-8ffe-47de-9922-ad5a01615168/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=70975) 
execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:27:10 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:27:10 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:27:10 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Hypervisor/Node resource view: name=user free_ram=8912MB free_disk=26.538818359375GB free_vcpus=10 pci_devices=[{"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": 
"0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70975) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 18 16:27:10 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:27:10 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:27:10 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 66df9389-d007-4737-8bb1-55bcb5f227ff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:27:10 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 2e71dfcf-8ffe-47de-9922-ad5a01615168 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:27:10 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Total usable vcpus: 12, total allocated vcpus: 2 {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 18 16:27:10 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Final resource view: name=user phys_ram=16023MB used_ram=768MB phys_disk=40GB used_disk=2GB total_vcpus=12 used_vcpus=2 pci_stats=[] {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 18 16:27:10 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:27:10 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:27:10 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Compute_service record updated for user:user {{(pid=70975) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 18 16:27:10 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.237s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:27:10 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:27:12 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:27:12 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:27:13 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:27:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 
16:27:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:27:21 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:27:22 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:27:23 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:27:27 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:27:32 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:27:37 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:27:37 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:27:42 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:27:42 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:27:42 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=70975) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 18 16:27:42 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=70975) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 18 16:27:42 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70975) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 18 16:27:42 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:27:47 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:27:52 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:27:52 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:27:52 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=70975) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 18 16:27:52 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
tcp:127.0.0.1:6640: entering IDLE {{(pid=70975) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 18 16:27:52 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70975) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 18 16:27:52 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:27:57 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:27:57 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:27:57 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=70975) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 18 16:27:57 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=70975) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 18 16:27:57 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70975) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 18 16:27:57 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:28:02 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:28:02 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:28:02 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=70975) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 18 16:28:02 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=70975) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 18 16:28:02 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70975) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 18 16:28:02 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:28:03 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:28:05 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:28:05 user nova-compute[70975]: 
DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:28:05 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=70975) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 18 16:28:07 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:28:07 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:28:07 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:28:07 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=70975) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 18 16:28:07 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=70975) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 18 16:28:07 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70975) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 18 16:28:07 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:28:09 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:28:09 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager.update_available_resource {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:28:09 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:28:09 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:28:09 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:28:09 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Auditing locally available compute resources for user (node: user) {{(pid=70975) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 18 16:28:09 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:28:09 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:28:09 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:28:09 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:28:09 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2e71dfcf-8ffe-47de-9922-ad5a01615168/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:28:09 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2e71dfcf-8ffe-47de-9922-ad5a01615168/disk --force-share --output=json" returned: 0 in 0.128s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:28:09 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit 
--as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2e71dfcf-8ffe-47de-9922-ad5a01615168/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:28:10 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2e71dfcf-8ffe-47de-9922-ad5a01615168/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:28:10 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:28:10 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:28:10 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Hypervisor/Node resource view: name=user free_ram=9015MB free_disk=26.528369903564453GB free_vcpus=10 pci_devices=[{"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": 
"type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", 
"vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70975) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 18 16:28:10 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:28:10 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:28:10 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 66df9389-d007-4737-8bb1-55bcb5f227ff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:28:10 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 2e71dfcf-8ffe-47de-9922-ad5a01615168 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:28:10 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Total usable vcpus: 12, total allocated vcpus: 2 {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 18 16:28:10 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Final resource view: name=user phys_ram=16023MB used_ram=768MB phys_disk=40GB used_disk=2GB total_vcpus=12 used_vcpus=2 pci_stats=[] {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 18 16:28:10 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:28:10 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:28:10 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Compute_service record updated for user:user {{(pid=70975) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 18 16:28:10 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.206s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:28:11 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:28:11 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Starting heal instance info cache {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 18 16:28:11 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "refresh_cache-2e71dfcf-8ffe-47de-9922-ad5a01615168" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:28:11 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquired lock "refresh_cache-2e71dfcf-8ffe-47de-9922-ad5a01615168" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:28:11 user 
nova-compute[70975]: DEBUG nova.network.neutron [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Forcefully refreshing network info cache for instance {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 18 16:28:12 user nova-compute[70975]: DEBUG nova.network.neutron [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Updating instance_info_cache with network_info: [{"id": "a625f474-3393-4722-8b87-e0fcff38705b", "address": "fa:16:3e:35:85:e4", "network": {"id": "1cb85bc6-49f1-46be-91c8-d814b48c2a99", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1818136803-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "00d4b993c13b46ea8f80b0caed60a373", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa625f474-33", "ovs_interfaceid": "a625f474-3393-4722-8b87-e0fcff38705b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:28:12 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Releasing lock "refresh_cache-2e71dfcf-8ffe-47de-9922-ad5a01615168" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:28:12 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Updated the network info_cache for instance {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 18 16:28:12 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:28:12 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:28:12 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:28:14 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:28:17 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:28:17 user nova-compute[70975]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:28:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:28:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=70975) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 18 16:28:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=70975) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 18 16:28:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70975) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 18 16:28:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:28:22 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:28:27 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:28:32 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:28:37 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-24a91138-43db-42d8-b1c1-df81fa3b53da tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Acquiring lock "2e71dfcf-8ffe-47de-9922-ad5a01615168" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:28:37 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-24a91138-43db-42d8-b1c1-df81fa3b53da tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lock "2e71dfcf-8ffe-47de-9922-ad5a01615168" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.002s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:28:37 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-24a91138-43db-42d8-b1c1-df81fa3b53da tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Acquiring lock "2e71dfcf-8ffe-47de-9922-ad5a01615168-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:28:37 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-24a91138-43db-42d8-b1c1-df81fa3b53da tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lock "2e71dfcf-8ffe-47de-9922-ad5a01615168-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=70975) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:28:37 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-24a91138-43db-42d8-b1c1-df81fa3b53da tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lock "2e71dfcf-8ffe-47de-9922-ad5a01615168-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:28:37 user nova-compute[70975]: INFO nova.compute.manager [None req-24a91138-43db-42d8-b1c1-df81fa3b53da tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Terminating instance Apr 18 16:28:37 user nova-compute[70975]: DEBUG nova.compute.manager [None req-24a91138-43db-42d8-b1c1-df81fa3b53da tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Start destroying the instance on the hypervisor. {{(pid=70975) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 18 16:28:37 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:28:37 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:28:37 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:28:37 user nova-compute[70975]: DEBUG nova.compute.manager [req-31c0c009-fe2e-4f20-927b-11eb83fed2e9 req-7c14e14d-0abd-41fe-9dcd-f66c2e354231 service nova] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Received event network-vif-unplugged-a625f474-3393-4722-8b87-e0fcff38705b {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:28:37 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-31c0c009-fe2e-4f20-927b-11eb83fed2e9 req-7c14e14d-0abd-41fe-9dcd-f66c2e354231 service nova] Acquiring lock "2e71dfcf-8ffe-47de-9922-ad5a01615168-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:28:37 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-31c0c009-fe2e-4f20-927b-11eb83fed2e9 req-7c14e14d-0abd-41fe-9dcd-f66c2e354231 service nova] Lock "2e71dfcf-8ffe-47de-9922-ad5a01615168-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:28:37 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-31c0c009-fe2e-4f20-927b-11eb83fed2e9 req-7c14e14d-0abd-41fe-9dcd-f66c2e354231 service nova] Lock "2e71dfcf-8ffe-47de-9922-ad5a01615168-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:28:37 user nova-compute[70975]: DEBUG nova.compute.manager [req-31c0c009-fe2e-4f20-927b-11eb83fed2e9 
req-7c14e14d-0abd-41fe-9dcd-f66c2e354231 service nova] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] No waiting events found dispatching network-vif-unplugged-a625f474-3393-4722-8b87-e0fcff38705b {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:28:37 user nova-compute[70975]: DEBUG nova.compute.manager [req-31c0c009-fe2e-4f20-927b-11eb83fed2e9 req-7c14e14d-0abd-41fe-9dcd-f66c2e354231 service nova] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Received event network-vif-unplugged-a625f474-3393-4722-8b87-e0fcff38705b for instance with task_state deleting. {{(pid=70975) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 18 16:28:37 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:28:37 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:28:37 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:28:37 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:28:37 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Instance destroyed successfully. Apr 18 16:28:37 user nova-compute[70975]: DEBUG nova.objects.instance [None req-24a91138-43db-42d8-b1c1-df81fa3b53da tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lazy-loading 'resources' on Instance uuid 2e71dfcf-8ffe-47de-9922-ad5a01615168 {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:28:37 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-24a91138-43db-42d8-b1c1-df81fa3b53da tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:26:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-645939165',display_name='tempest-TestMinimumBasicScenario-server-645939165',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-645939165',id=25,image_ref='ec984284-e2e2-460c-8d57-fd257eb433e3',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFTjzjFXAU6TCf5cUZ4HAW/uTn6Y3TUbR85mBRl2+FlNM0NVwWI69f9l4ZtVvR9kFoEORm5vcNFscXnnbdpx5zWg+f/sXB25kjR5e5QBGQL42u3J1dK+/RPVXZi6Tm2BzQ==',key_name='tempest-TestMinimumBasicScenario-1240941080',keypairs=,launch_index=0,launched_at=2023-04-18T16:26:51Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='00d4b993c13b46ea8f80b0caed60a373',ramdisk_id='',reservation_id='r-dt6ukame',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ec984284-e2e2-460c-8d57-fd257eb433e3',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestMinimumBasicScenario-386776323',owner_user_name='tempest-TestMinimumBasicScenario-386776323-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-18T16:26:51Z,user_data=None,user_id='add4c9d906ba49a590f203e0aa98ab64',uuid=2e71dfcf-8ffe-47de-9922-ad5a01615168,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a625f474-3393-4722-8b87-e0fcff38705b", "address": "fa:16:3e:35:85:e4", "network": {"id": "1cb85bc6-49f1-46be-91c8-d814b48c2a99", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1818136803-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "00d4b993c13b46ea8f80b0caed60a373", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa625f474-33", "ovs_interfaceid": "a625f474-3393-4722-8b87-e0fcff38705b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 18 16:28:37 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-24a91138-43db-42d8-b1c1-df81fa3b53da tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Converting VIF {"id": "a625f474-3393-4722-8b87-e0fcff38705b", "address": "fa:16:3e:35:85:e4", "network": {"id": "1cb85bc6-49f1-46be-91c8-d814b48c2a99", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1818136803-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "00d4b993c13b46ea8f80b0caed60a373", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": 
"ovn"}}, "devname": "tapa625f474-33", "ovs_interfaceid": "a625f474-3393-4722-8b87-e0fcff38705b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:28:37 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-24a91138-43db-42d8-b1c1-df81fa3b53da tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:35:85:e4,bridge_name='br-int',has_traffic_filtering=True,id=a625f474-3393-4722-8b87-e0fcff38705b,network=Network(1cb85bc6-49f1-46be-91c8-d814b48c2a99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa625f474-33') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:28:37 user nova-compute[70975]: DEBUG os_vif [None req-24a91138-43db-42d8-b1c1-df81fa3b53da tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:35:85:e4,bridge_name='br-int',has_traffic_filtering=True,id=a625f474-3393-4722-8b87-e0fcff38705b,network=Network(1cb85bc6-49f1-46be-91c8-d814b48c2a99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa625f474-33') {{(pid=70975) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 18 16:28:37 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:28:37 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa625f474-33, bridge=br-int, if_exists=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:28:37 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:28:37 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:28:37 user nova-compute[70975]: INFO os_vif [None req-24a91138-43db-42d8-b1c1-df81fa3b53da tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:35:85:e4,bridge_name='br-int',has_traffic_filtering=True,id=a625f474-3393-4722-8b87-e0fcff38705b,network=Network(1cb85bc6-49f1-46be-91c8-d814b48c2a99),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa625f474-33') Apr 18 16:28:37 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-24a91138-43db-42d8-b1c1-df81fa3b53da tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Deleting instance files /opt/stack/data/nova/instances/2e71dfcf-8ffe-47de-9922-ad5a01615168_del Apr 18 16:28:37 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-24a91138-43db-42d8-b1c1-df81fa3b53da tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] 
[instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Deletion of /opt/stack/data/nova/instances/2e71dfcf-8ffe-47de-9922-ad5a01615168_del complete Apr 18 16:28:37 user nova-compute[70975]: INFO nova.compute.manager [None req-24a91138-43db-42d8-b1c1-df81fa3b53da tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Took 0.66 seconds to destroy the instance on the hypervisor. Apr 18 16:28:37 user nova-compute[70975]: DEBUG oslo.service.loopingcall [None req-24a91138-43db-42d8-b1c1-df81fa3b53da tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=70975) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 18 16:28:37 user nova-compute[70975]: DEBUG nova.compute.manager [-] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Deallocating network for instance {{(pid=70975) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 18 16:28:37 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] deallocate_for_instance() {{(pid=70975) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 18 16:28:38 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:28:38 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:28:38 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:28:38 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:28:38 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:28:38 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:28:38 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Updating instance_info_cache with network_info: [] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:28:38 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Took 0.75 seconds to deallocate network for instance. 
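The teardown sequence above runs through os-vif: nova converts its VIF dict into the VIFOpenVSwitch object shown in the log, and os_vif.unplug() removes tapa625f474-33 from br-int via the DelPortCommand transaction before the instance files are deleted and the network is deallocated. Below is a minimal sketch of driving the same unplug path outside nova, assuming the os-vif ovs plugin is installed and the local OVSDB is reachable; the field set is trimmed relative to the full object repr in the log, and the InstanceInfo values are copied from it.

    # Sketch only (not nova code): unplug the OVS VIF that nova-compute
    # logged above, using the public os-vif API.
    import os_vif
    from os_vif.objects import instance_info, network, vif

    os_vif.initialize()   # loads the linux_bridge, noop and ovs plugins

    port = vif.VIFOpenVSwitch(
        id='a625f474-3393-4722-8b87-e0fcff38705b',
        address='fa:16:3e:35:85:e4',
        vif_name='tapa625f474-33',
        bridge_name='br-int',
        has_traffic_filtering=True,
        preserve_on_delete=False,
        network=network.Network(id='1cb85bc6-49f1-46be-91c8-d814b48c2a99',
                                bridge='br-int'),
        port_profile=vif.VIFPortProfileOpenVSwitch(
            interface_id='a625f474-3393-4722-8b87-e0fcff38705b'),
    )
    instance = instance_info.InstanceInfo(
        uuid='2e71dfcf-8ffe-47de-9922-ad5a01615168',
        name='tempest-TestMinimumBasicScenario-server-645939165')

    # The ovs plugin turns this into the DelPortCommand(port=tapa625f474-33,
    # bridge=br-int, if_exists=True) transaction visible in the log.
    os_vif.unplug(port, instance)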
Apr 18 16:28:38 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-24a91138-43db-42d8-b1c1-df81fa3b53da tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:28:38 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-24a91138-43db-42d8-b1c1-df81fa3b53da tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:28:38 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-24a91138-43db-42d8-b1c1-df81fa3b53da tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:28:38 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-24a91138-43db-42d8-b1c1-df81fa3b53da tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:28:38 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-24a91138-43db-42d8-b1c1-df81fa3b53da tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.133s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:28:38 user nova-compute[70975]: INFO nova.scheduler.client.report [None req-24a91138-43db-42d8-b1c1-df81fa3b53da tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Deleted allocations for instance 2e71dfcf-8ffe-47de-9922-ad5a01615168 Apr 18 16:28:38 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-24a91138-43db-42d8-b1c1-df81fa3b53da tempest-TestMinimumBasicScenario-386776323 tempest-TestMinimumBasicScenario-386776323-project-member] Lock "2e71dfcf-8ffe-47de-9922-ad5a01615168" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.714s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:28:39 user nova-compute[70975]: DEBUG nova.compute.manager [req-744b2252-a505-4305-a533-8f9eefc512e9 req-804d3808-2210-46a4-bcc8-0265b4e938c6 service nova] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Received event network-vif-plugged-a625f474-3393-4722-8b87-e0fcff38705b {{(pid=70975) external_instance_event 
/opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:28:39 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-744b2252-a505-4305-a533-8f9eefc512e9 req-804d3808-2210-46a4-bcc8-0265b4e938c6 service nova] Acquiring lock "2e71dfcf-8ffe-47de-9922-ad5a01615168-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:28:39 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-744b2252-a505-4305-a533-8f9eefc512e9 req-804d3808-2210-46a4-bcc8-0265b4e938c6 service nova] Lock "2e71dfcf-8ffe-47de-9922-ad5a01615168-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:28:39 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-744b2252-a505-4305-a533-8f9eefc512e9 req-804d3808-2210-46a4-bcc8-0265b4e938c6 service nova] Lock "2e71dfcf-8ffe-47de-9922-ad5a01615168-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:28:39 user nova-compute[70975]: DEBUG nova.compute.manager [req-744b2252-a505-4305-a533-8f9eefc512e9 req-804d3808-2210-46a4-bcc8-0265b4e938c6 service nova] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] No waiting events found dispatching network-vif-plugged-a625f474-3393-4722-8b87-e0fcff38705b {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:28:39 user nova-compute[70975]: WARNING nova.compute.manager [req-744b2252-a505-4305-a533-8f9eefc512e9 req-804d3808-2210-46a4-bcc8-0265b4e938c6 service nova] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Received unexpected event network-vif-plugged-a625f474-3393-4722-8b87-e0fcff38705b for instance with vm_state deleted and task_state None. 
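Most of the surrounding chatter is oslo.concurrency lock traffic: the plain Acquiring/Acquired/Releasing lines come from the lockutils.lock() context manager (lockutils.py:312/315/333 in the log), while the lines of the form Lock "..." acquired by "..." :: waited/held come from the inner() wrapper that the lockutils.synchronized() decorator places around a function. A small illustrative sketch of both forms follows; the lock names are borrowed from the log, but the call sites are hypothetical, not nova's.

    # Illustrative sketch of the two oslo.concurrency locking styles seen in
    # these entries; lock names copied from the log, functions hypothetical.
    from oslo_concurrency import lockutils

    # Decorator form: produces the 'Lock "<name>" acquired by "<func>" ::
    # waited ...s' and ':: held ...s' DEBUG lines via the inner() wrapper.
    @lockutils.synchronized('2e71dfcf-8ffe-47de-9922-ad5a01615168-events')
    def _pop_event():
        pass   # critical section guarded by the per-instance events lock

    # Context-manager form: produces the plain Acquiring/Acquired/Releasing
    # lines, as with the "refresh_cache-<uuid>" locks elsewhere in this log.
    def refresh_cache():
        with lockutils.lock('refresh_cache-2e71dfcf-8ffe-47de-9922-ad5a01615168'):
            pass   # work done while holding the cache refresh lock

    _pop_event()
    refresh_cache()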
Apr 18 16:28:39 user nova-compute[70975]: DEBUG nova.compute.manager [req-744b2252-a505-4305-a533-8f9eefc512e9 req-804d3808-2210-46a4-bcc8-0265b4e938c6 service nova] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Received event network-vif-plugged-a625f474-3393-4722-8b87-e0fcff38705b {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:28:39 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-744b2252-a505-4305-a533-8f9eefc512e9 req-804d3808-2210-46a4-bcc8-0265b4e938c6 service nova] Acquiring lock "2e71dfcf-8ffe-47de-9922-ad5a01615168-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:28:39 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-744b2252-a505-4305-a533-8f9eefc512e9 req-804d3808-2210-46a4-bcc8-0265b4e938c6 service nova] Lock "2e71dfcf-8ffe-47de-9922-ad5a01615168-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:28:39 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-744b2252-a505-4305-a533-8f9eefc512e9 req-804d3808-2210-46a4-bcc8-0265b4e938c6 service nova] Lock "2e71dfcf-8ffe-47de-9922-ad5a01615168-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:28:39 user nova-compute[70975]: DEBUG nova.compute.manager [req-744b2252-a505-4305-a533-8f9eefc512e9 req-804d3808-2210-46a4-bcc8-0265b4e938c6 service nova] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] No waiting events found dispatching network-vif-plugged-a625f474-3393-4722-8b87-e0fcff38705b {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:28:39 user nova-compute[70975]: WARNING nova.compute.manager [req-744b2252-a505-4305-a533-8f9eefc512e9 req-804d3808-2210-46a4-bcc8-0265b4e938c6 service nova] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Received unexpected event network-vif-plugged-a625f474-3393-4722-8b87-e0fcff38705b for instance with vm_state deleted and task_state None. 
Apr 18 16:28:39 user nova-compute[70975]: DEBUG nova.compute.manager [req-744b2252-a505-4305-a533-8f9eefc512e9 req-804d3808-2210-46a4-bcc8-0265b4e938c6 service nova] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Received event network-vif-plugged-a625f474-3393-4722-8b87-e0fcff38705b {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:28:39 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-744b2252-a505-4305-a533-8f9eefc512e9 req-804d3808-2210-46a4-bcc8-0265b4e938c6 service nova] Acquiring lock "2e71dfcf-8ffe-47de-9922-ad5a01615168-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:28:39 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-744b2252-a505-4305-a533-8f9eefc512e9 req-804d3808-2210-46a4-bcc8-0265b4e938c6 service nova] Lock "2e71dfcf-8ffe-47de-9922-ad5a01615168-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:28:39 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-744b2252-a505-4305-a533-8f9eefc512e9 req-804d3808-2210-46a4-bcc8-0265b4e938c6 service nova] Lock "2e71dfcf-8ffe-47de-9922-ad5a01615168-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:28:39 user nova-compute[70975]: DEBUG nova.compute.manager [req-744b2252-a505-4305-a533-8f9eefc512e9 req-804d3808-2210-46a4-bcc8-0265b4e938c6 service nova] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] No waiting events found dispatching network-vif-plugged-a625f474-3393-4722-8b87-e0fcff38705b {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:28:39 user nova-compute[70975]: WARNING nova.compute.manager [req-744b2252-a505-4305-a533-8f9eefc512e9 req-804d3808-2210-46a4-bcc8-0265b4e938c6 service nova] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Received unexpected event network-vif-plugged-a625f474-3393-4722-8b87-e0fcff38705b for instance with vm_state deleted and task_state None. 
Apr 18 16:28:39 user nova-compute[70975]: DEBUG nova.compute.manager [req-744b2252-a505-4305-a533-8f9eefc512e9 req-804d3808-2210-46a4-bcc8-0265b4e938c6 service nova] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Received event network-vif-plugged-a625f474-3393-4722-8b87-e0fcff38705b {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:28:39 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-744b2252-a505-4305-a533-8f9eefc512e9 req-804d3808-2210-46a4-bcc8-0265b4e938c6 service nova] Acquiring lock "2e71dfcf-8ffe-47de-9922-ad5a01615168-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:28:39 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-744b2252-a505-4305-a533-8f9eefc512e9 req-804d3808-2210-46a4-bcc8-0265b4e938c6 service nova] Lock "2e71dfcf-8ffe-47de-9922-ad5a01615168-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:28:39 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-744b2252-a505-4305-a533-8f9eefc512e9 req-804d3808-2210-46a4-bcc8-0265b4e938c6 service nova] Lock "2e71dfcf-8ffe-47de-9922-ad5a01615168-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:28:39 user nova-compute[70975]: DEBUG nova.compute.manager [req-744b2252-a505-4305-a533-8f9eefc512e9 req-804d3808-2210-46a4-bcc8-0265b4e938c6 service nova] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] No waiting events found dispatching network-vif-plugged-a625f474-3393-4722-8b87-e0fcff38705b {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:28:39 user nova-compute[70975]: WARNING nova.compute.manager [req-744b2252-a505-4305-a533-8f9eefc512e9 req-804d3808-2210-46a4-bcc8-0265b4e938c6 service nova] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Received unexpected event network-vif-plugged-a625f474-3393-4722-8b87-e0fcff38705b for instance with vm_state deleted and task_state None. 
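The network-vif-plugged, network-vif-unplugged and network-vif-deleted entries in this section are delivered through Nova's os-server-external-events API, which is how Neutron tells nova-compute about port changes; since the instance is already gone, each delivery ends in the "Received unexpected event ... vm_state deleted" warning above. A minimal sketch of the request shape follows, assuming a plain REST call with the requests library; the endpoint URL and token are placeholders, while the server and port UUIDs are the ones from the log.

    # Sketch of the kind of notification that produces the
    # "Received event network-vif-plugged-..." entries above.
    import requests

    NOVA_URL = 'http://controller/compute/v2.1'   # placeholder endpoint
    TOKEN = 'keystone-token-goes-here'            # placeholder auth token

    payload = {
        'events': [{
            'name': 'network-vif-plugged',
            'server_uuid': '2e71dfcf-8ffe-47de-9922-ad5a01615168',
            'tag': 'a625f474-3393-4722-8b87-e0fcff38705b',  # neutron port id
            'status': 'completed',
        }]
    }
    resp = requests.post(NOVA_URL + '/os-server-external-events',
                         json=payload,
                         headers={'X-Auth-Token': TOKEN})
    # nova-api forwards accepted events to the owning compute host, where they
    # are matched against registered waiters; with no waiter and the VM already
    # deleted, nova-compute logs the warning seen in this section.
    print(resp.status_code, resp.text)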
Apr 18 16:28:39 user nova-compute[70975]: DEBUG nova.compute.manager [req-744b2252-a505-4305-a533-8f9eefc512e9 req-804d3808-2210-46a4-bcc8-0265b4e938c6 service nova] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Received event network-vif-deleted-a625f474-3393-4722-8b87-e0fcff38705b {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:28:42 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:28:47 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:28:52 user nova-compute[70975]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:28:52 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] VM Stopped (Lifecycle Event) Apr 18 16:28:52 user nova-compute[70975]: DEBUG nova.compute.manager [None req-bb48125e-59a9-495a-b4c5-7c7df3ca027f None None] [instance: 2e71dfcf-8ffe-47de-9922-ad5a01615168] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:28:52 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:28:57 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:28:57 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:28:57 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=70975) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 18 16:28:57 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=70975) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 18 16:28:57 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70975) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 18 16:28:57 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:29:02 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:29:02 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:29:02 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=70975) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 18 16:29:02 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=70975) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 18 16:29:02 user nova-compute[70975]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70975) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 18 16:29:02 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:29:03 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:29:06 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:29:06 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=70975) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 18 16:29:07 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:29:07 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:29:08 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:29:09 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:29:10 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:29:10 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Starting heal instance info cache {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 18 16:29:10 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Rebuilding the list of instances to heal {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 18 16:29:10 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "refresh_cache-66df9389-d007-4737-8bb1-55bcb5f227ff" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 18 16:29:10 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] 
Acquired lock "refresh_cache-66df9389-d007-4737-8bb1-55bcb5f227ff" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 18 16:29:10 user nova-compute[70975]: DEBUG nova.network.neutron [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Forcefully refreshing network info cache for instance {{(pid=70975) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 18 16:29:10 user nova-compute[70975]: DEBUG nova.objects.instance [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lazy-loading 'info_cache' on Instance uuid 66df9389-d007-4737-8bb1-55bcb5f227ff {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:29:11 user nova-compute[70975]: DEBUG nova.network.neutron [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Updating instance_info_cache with network_info: [{"id": "b66d41ab-873c-4826-a3f8-d4f4276fff10", "address": "fa:16:3e:58:32:25", "network": {"id": "236fa8aa-433b-4dfa-a787-f165c3389489", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1486162327-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5695adbb14ea4162bc40547b1509a1e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb66d41ab-87", "ovs_interfaceid": "b66d41ab-873c-4826-a3f8-d4f4276fff10", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:29:11 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Releasing lock "refresh_cache-66df9389-d007-4737-8bb1-55bcb5f227ff" {{(pid=70975) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 18 16:29:11 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Updated the network info_cache for instance {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 18 16:29:11 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager.update_available_resource {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:29:11 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:29:11 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:29:11 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:29:11 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Auditing locally available compute resources for user (node: user) {{(pid=70975) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 18 16:29:11 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:29:11 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:29:11 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk --force-share --output=json {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 18 16:29:11 user nova-compute[70975]: DEBUG oslo_concurrency.processutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=70975) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 18 16:29:12 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:29:12 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 18 16:29:12 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Hypervisor/Node resource view: name=user free_ram=9052MB free_disk=26.547176361083984GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", 
"numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70975) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 18 16:29:12 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:29:12 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:29:12 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Instance 66df9389-d007-4737-8bb1-55bcb5f227ff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=70975) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 18 16:29:12 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 18 16:29:12 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 18 16:29:12 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:29:12 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:29:12 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Compute_service record updated for user:user {{(pid=70975) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 18 16:29:12 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.199s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:29:12 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:29:12 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:29:12 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=70975) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 18 16:29:12 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=70975) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 18 16:29:12 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70975) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 18 16:29:12 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:29:13 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:29:16 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:29:17 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:29:22 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:29:22 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:29:22 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=70975) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 18 16:29:22 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=70975) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 18 16:29:22 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70975) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 18 16:29:22 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:29:27 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:29:28 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-8cb584c0-612d-4cc9-afed-5504d220544e tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Acquiring lock "66df9389-d007-4737-8bb1-55bcb5f227ff" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:29:28 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-8cb584c0-612d-4cc9-afed-5504d220544e tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Lock "66df9389-d007-4737-8bb1-55bcb5f227ff" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:29:28 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-8cb584c0-612d-4cc9-afed-5504d220544e tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Acquiring lock "66df9389-d007-4737-8bb1-55bcb5f227ff-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70975) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:29:28 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-8cb584c0-612d-4cc9-afed-5504d220544e tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Lock "66df9389-d007-4737-8bb1-55bcb5f227ff-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:29:28 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-8cb584c0-612d-4cc9-afed-5504d220544e tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Lock "66df9389-d007-4737-8bb1-55bcb5f227ff-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:29:28 user nova-compute[70975]: INFO nova.compute.manager [None req-8cb584c0-612d-4cc9-afed-5504d220544e tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Terminating instance Apr 18 16:29:28 user nova-compute[70975]: DEBUG nova.compute.manager [None req-8cb584c0-612d-4cc9-afed-5504d220544e tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Start destroying the instance on the hypervisor. {{(pid=70975) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 18 16:29:28 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:29:28 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:29:28 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:29:28 user nova-compute[70975]: DEBUG nova.compute.manager [req-b1799213-6b76-4bc7-9db4-611f1c3bca1e req-1d762318-6fbf-415c-bd80-21503652b189 service nova] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Received event network-vif-unplugged-b66d41ab-873c-4826-a3f8-d4f4276fff10 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:29:28 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-b1799213-6b76-4bc7-9db4-611f1c3bca1e req-1d762318-6fbf-415c-bd80-21503652b189 service nova] Acquiring lock "66df9389-d007-4737-8bb1-55bcb5f227ff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:29:28 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-b1799213-6b76-4bc7-9db4-611f1c3bca1e req-1d762318-6fbf-415c-bd80-21503652b189 service nova] Lock "66df9389-d007-4737-8bb1-55bcb5f227ff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:29:28 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils 
[req-b1799213-6b76-4bc7-9db4-611f1c3bca1e req-1d762318-6fbf-415c-bd80-21503652b189 service nova] Lock "66df9389-d007-4737-8bb1-55bcb5f227ff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:29:28 user nova-compute[70975]: DEBUG nova.compute.manager [req-b1799213-6b76-4bc7-9db4-611f1c3bca1e req-1d762318-6fbf-415c-bd80-21503652b189 service nova] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] No waiting events found dispatching network-vif-unplugged-b66d41ab-873c-4826-a3f8-d4f4276fff10 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:29:28 user nova-compute[70975]: DEBUG nova.compute.manager [req-b1799213-6b76-4bc7-9db4-611f1c3bca1e req-1d762318-6fbf-415c-bd80-21503652b189 service nova] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Received event network-vif-unplugged-b66d41ab-873c-4826-a3f8-d4f4276fff10 for instance with task_state deleting. {{(pid=70975) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 18 16:29:28 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:29:29 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:29:29 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:29:29 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:29:29 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:29:29 user nova-compute[70975]: INFO nova.virt.libvirt.driver [-] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Instance destroyed successfully. 
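Editor's note on the teardown that follows: in the next entries nova rebuilds the os-vif representation of the instance's Neutron port (the "Converting VIF" / "Converted object" lines) and hands it to the ovs plugin, which removes tapb66d41ab-87 from br-int via ovsdbapp. A rough sketch of driving os-vif the same way directly is below; it assumes the os-vif package is installed, that it runs as root on the compute host, and that a port and bridge with these names actually exist. Field values are copied from the VIFOpenVSwitch object in the log, but the exact set of required fields is plugin-dependent, so treat this as illustrative rather than a drop-in tool.

    import os_vif
    from os_vif.objects import instance_info as osv_instance
    from os_vif.objects import network as osv_network
    from os_vif.objects import vif as osv_vif

    # Loads the linux_bridge/noop/ovs plugins, as in the os_vif startup lines.
    os_vif.initialize()

    net = osv_network.Network(
        id="236fa8aa-433b-4dfa-a787-f165c3389489",
        bridge="br-int",
    )
    port = osv_vif.VIFOpenVSwitch(
        id="b66d41ab-873c-4826-a3f8-d4f4276fff10",
        address="fa:16:3e:58:32:25",
        vif_name="tapb66d41ab-87",
        bridge_name="br-int",
        network=net,
        port_profile=osv_vif.VIFPortProfileOpenVSwitch(
            interface_id="b66d41ab-873c-4826-a3f8-d4f4276fff10",
        ),
    )
    instance = osv_instance.InstanceInfo(
        uuid="66df9389-d007-4737-8bb1-55bcb5f227ff",
        name="tempest-ServersNegativeTestJSON-server-1356234404",
    )

    # Same entry point the "Unplugging vif VIFOpenVSwitch(...)" line records.
    os_vif.unplug(port, instance)

The DelPortCommand(... port=tapb66d41ab-87, bridge=br-int, if_exists=True) transaction further down is the ovsdbapp operation this call ultimately issues.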
Apr 18 16:29:29 user nova-compute[70975]: DEBUG nova.objects.instance [None req-8cb584c0-612d-4cc9-afed-5504d220544e tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Lazy-loading 'resources' on Instance uuid 66df9389-d007-4737-8bb1-55bcb5f227ff {{(pid=70975) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 18 16:29:29 user nova-compute[70975]: DEBUG nova.virt.libvirt.vif [None req-8cb584c0-612d-4cc9-afed-5504d220544e tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-18T16:20:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1356234404',display_name='tempest-ServersNegativeTestJSON-server-1356234404',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-1356234404',id=19,image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-18T16:20:59Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='5695adbb14ea4162bc40547b1509a1e4',ramdisk_id='',reservation_id='r-9ymcy2v8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b11a20de-f82a-4158-b53e-0a0c7a1552cb',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServersNegativeTestJSON-1696086909',owner_user_name='tempest-ServersNegativeTestJSON-1696086909-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-18T16:20:59Z,user_data=None,user_id='2963911de4f34d79816a9a1f9ad24a27',uuid=66df9389-d007-4737-8bb1-55bcb5f227ff,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b66d41ab-873c-4826-a3f8-d4f4276fff10", "address": "fa:16:3e:58:32:25", "network": {"id": "236fa8aa-433b-4dfa-a787-f165c3389489", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1486162327-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5695adbb14ea4162bc40547b1509a1e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": 
{"0": "ovn"}}, "devname": "tapb66d41ab-87", "ovs_interfaceid": "b66d41ab-873c-4826-a3f8-d4f4276fff10", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 18 16:29:29 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-8cb584c0-612d-4cc9-afed-5504d220544e tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Converting VIF {"id": "b66d41ab-873c-4826-a3f8-d4f4276fff10", "address": "fa:16:3e:58:32:25", "network": {"id": "236fa8aa-433b-4dfa-a787-f165c3389489", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1486162327-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "5695adbb14ea4162bc40547b1509a1e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb66d41ab-87", "ovs_interfaceid": "b66d41ab-873c-4826-a3f8-d4f4276fff10", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 18 16:29:29 user nova-compute[70975]: DEBUG nova.network.os_vif_util [None req-8cb584c0-612d-4cc9-afed-5504d220544e tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:58:32:25,bridge_name='br-int',has_traffic_filtering=True,id=b66d41ab-873c-4826-a3f8-d4f4276fff10,network=Network(236fa8aa-433b-4dfa-a787-f165c3389489),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb66d41ab-87') {{(pid=70975) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 18 16:29:29 user nova-compute[70975]: DEBUG os_vif [None req-8cb584c0-612d-4cc9-afed-5504d220544e tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:58:32:25,bridge_name='br-int',has_traffic_filtering=True,id=b66d41ab-873c-4826-a3f8-d4f4276fff10,network=Network(236fa8aa-433b-4dfa-a787-f165c3389489),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb66d41ab-87') {{(pid=70975) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 18 16:29:29 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:29:29 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb66d41ab-87, bridge=br-int, if_exists=True) {{(pid=70975) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 18 16:29:29 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:29:29 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:29:29 user nova-compute[70975]: INFO os_vif [None req-8cb584c0-612d-4cc9-afed-5504d220544e tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:58:32:25,bridge_name='br-int',has_traffic_filtering=True,id=b66d41ab-873c-4826-a3f8-d4f4276fff10,network=Network(236fa8aa-433b-4dfa-a787-f165c3389489),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb66d41ab-87') Apr 18 16:29:29 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-8cb584c0-612d-4cc9-afed-5504d220544e tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Deleting instance files /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff_del Apr 18 16:29:29 user nova-compute[70975]: INFO nova.virt.libvirt.driver [None req-8cb584c0-612d-4cc9-afed-5504d220544e tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Deletion of /opt/stack/data/nova/instances/66df9389-d007-4737-8bb1-55bcb5f227ff_del complete Apr 18 16:29:29 user nova-compute[70975]: INFO nova.compute.manager [None req-8cb584c0-612d-4cc9-afed-5504d220544e tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Took 0.88 seconds to destroy the instance on the hypervisor. Apr 18 16:29:29 user nova-compute[70975]: DEBUG oslo.service.loopingcall [None req-8cb584c0-612d-4cc9-afed-5504d220544e tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=70975) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 18 16:29:29 user nova-compute[70975]: DEBUG nova.compute.manager [-] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Deallocating network for instance {{(pid=70975) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 18 16:29:29 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] deallocate_for_instance() {{(pid=70975) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 18 16:29:29 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:29:29 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:29:29 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:29:29 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:29:30 user nova-compute[70975]: DEBUG nova.network.neutron [-] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Updating instance_info_cache with network_info: [] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:29:30 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Took 0.72 seconds to deallocate network for instance. Apr 18 16:29:30 user nova-compute[70975]: DEBUG nova.compute.manager [req-2c5fc940-2e4d-499f-9bc6-4fd82ba1c66d req-47649bdf-1f64-4f8c-8254-ecb51ad6dfe2 service nova] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Received event network-vif-deleted-b66d41ab-873c-4826-a3f8-d4f4276fff10 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:29:30 user nova-compute[70975]: INFO nova.compute.manager [req-2c5fc940-2e4d-499f-9bc6-4fd82ba1c66d req-47649bdf-1f64-4f8c-8254-ecb51ad6dfe2 service nova] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Neutron deleted interface b66d41ab-873c-4826-a3f8-d4f4276fff10; detaching it from the instance and deleting it from the info cache Apr 18 16:29:30 user nova-compute[70975]: DEBUG nova.network.neutron [req-2c5fc940-2e4d-499f-9bc6-4fd82ba1c66d req-47649bdf-1f64-4f8c-8254-ecb51ad6dfe2 service nova] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Updating instance_info_cache with network_info: [] {{(pid=70975) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 18 16:29:30 user nova-compute[70975]: DEBUG nova.compute.manager [req-2c5fc940-2e4d-499f-9bc6-4fd82ba1c66d req-47649bdf-1f64-4f8c-8254-ecb51ad6dfe2 service nova] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Detach interface failed, port_id=b66d41ab-873c-4826-a3f8-d4f4276fff10, reason: Instance 66df9389-d007-4737-8bb1-55bcb5f227ff could not be found. 
{{(pid=70975) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 18 16:29:30 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-8cb584c0-612d-4cc9-afed-5504d220544e tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:29:30 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-8cb584c0-612d-4cc9-afed-5504d220544e tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:29:30 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-8cb584c0-612d-4cc9-afed-5504d220544e tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:29:30 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-8cb584c0-612d-4cc9-afed-5504d220544e tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:29:30 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-8cb584c0-612d-4cc9-afed-5504d220544e tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.106s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:29:30 user nova-compute[70975]: INFO nova.scheduler.client.report [None req-8cb584c0-612d-4cc9-afed-5504d220544e tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Deleted allocations for instance 66df9389-d007-4737-8bb1-55bcb5f227ff Apr 18 16:29:30 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-8cb584c0-612d-4cc9-afed-5504d220544e tempest-ServersNegativeTestJSON-1696086909 tempest-ServersNegativeTestJSON-1696086909-project-member] Lock "66df9389-d007-4737-8bb1-55bcb5f227ff" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.889s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:29:30 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:29:30 user nova-compute[70975]: 
DEBUG nova.compute.manager [req-e4b933e4-6ed7-4a41-b072-4c29546c668f req-d558802b-02aa-48b9-ac91-c603c6479734 service nova] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Received event network-vif-plugged-b66d41ab-873c-4826-a3f8-d4f4276fff10 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:29:30 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-e4b933e4-6ed7-4a41-b072-4c29546c668f req-d558802b-02aa-48b9-ac91-c603c6479734 service nova] Acquiring lock "66df9389-d007-4737-8bb1-55bcb5f227ff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:29:30 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-e4b933e4-6ed7-4a41-b072-4c29546c668f req-d558802b-02aa-48b9-ac91-c603c6479734 service nova] Lock "66df9389-d007-4737-8bb1-55bcb5f227ff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:29:30 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-e4b933e4-6ed7-4a41-b072-4c29546c668f req-d558802b-02aa-48b9-ac91-c603c6479734 service nova] Lock "66df9389-d007-4737-8bb1-55bcb5f227ff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:29:30 user nova-compute[70975]: DEBUG nova.compute.manager [req-e4b933e4-6ed7-4a41-b072-4c29546c668f req-d558802b-02aa-48b9-ac91-c603c6479734 service nova] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] No waiting events found dispatching network-vif-plugged-b66d41ab-873c-4826-a3f8-d4f4276fff10 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:29:30 user nova-compute[70975]: WARNING nova.compute.manager [req-e4b933e4-6ed7-4a41-b072-4c29546c668f req-d558802b-02aa-48b9-ac91-c603c6479734 service nova] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Received unexpected event network-vif-plugged-b66d41ab-873c-4826-a3f8-d4f4276fff10 for instance with vm_state deleted and task_state None. 
Apr 18 16:29:30 user nova-compute[70975]: DEBUG nova.compute.manager [req-e4b933e4-6ed7-4a41-b072-4c29546c668f req-d558802b-02aa-48b9-ac91-c603c6479734 service nova] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Received event network-vif-plugged-b66d41ab-873c-4826-a3f8-d4f4276fff10 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:29:30 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-e4b933e4-6ed7-4a41-b072-4c29546c668f req-d558802b-02aa-48b9-ac91-c603c6479734 service nova] Acquiring lock "66df9389-d007-4737-8bb1-55bcb5f227ff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:29:30 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-e4b933e4-6ed7-4a41-b072-4c29546c668f req-d558802b-02aa-48b9-ac91-c603c6479734 service nova] Lock "66df9389-d007-4737-8bb1-55bcb5f227ff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:29:30 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-e4b933e4-6ed7-4a41-b072-4c29546c668f req-d558802b-02aa-48b9-ac91-c603c6479734 service nova] Lock "66df9389-d007-4737-8bb1-55bcb5f227ff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:29:30 user nova-compute[70975]: DEBUG nova.compute.manager [req-e4b933e4-6ed7-4a41-b072-4c29546c668f req-d558802b-02aa-48b9-ac91-c603c6479734 service nova] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] No waiting events found dispatching network-vif-plugged-b66d41ab-873c-4826-a3f8-d4f4276fff10 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:29:30 user nova-compute[70975]: WARNING nova.compute.manager [req-e4b933e4-6ed7-4a41-b072-4c29546c668f req-d558802b-02aa-48b9-ac91-c603c6479734 service nova] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Received unexpected event network-vif-plugged-b66d41ab-873c-4826-a3f8-d4f4276fff10 for instance with vm_state deleted and task_state None. 
Apr 18 16:29:30 user nova-compute[70975]: DEBUG nova.compute.manager [req-e4b933e4-6ed7-4a41-b072-4c29546c668f req-d558802b-02aa-48b9-ac91-c603c6479734 service nova] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Received event network-vif-plugged-b66d41ab-873c-4826-a3f8-d4f4276fff10 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:29:30 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-e4b933e4-6ed7-4a41-b072-4c29546c668f req-d558802b-02aa-48b9-ac91-c603c6479734 service nova] Acquiring lock "66df9389-d007-4737-8bb1-55bcb5f227ff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:29:30 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-e4b933e4-6ed7-4a41-b072-4c29546c668f req-d558802b-02aa-48b9-ac91-c603c6479734 service nova] Lock "66df9389-d007-4737-8bb1-55bcb5f227ff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:29:30 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-e4b933e4-6ed7-4a41-b072-4c29546c668f req-d558802b-02aa-48b9-ac91-c603c6479734 service nova] Lock "66df9389-d007-4737-8bb1-55bcb5f227ff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:29:30 user nova-compute[70975]: DEBUG nova.compute.manager [req-e4b933e4-6ed7-4a41-b072-4c29546c668f req-d558802b-02aa-48b9-ac91-c603c6479734 service nova] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] No waiting events found dispatching network-vif-plugged-b66d41ab-873c-4826-a3f8-d4f4276fff10 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:29:30 user nova-compute[70975]: WARNING nova.compute.manager [req-e4b933e4-6ed7-4a41-b072-4c29546c668f req-d558802b-02aa-48b9-ac91-c603c6479734 service nova] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Received unexpected event network-vif-plugged-b66d41ab-873c-4826-a3f8-d4f4276fff10 for instance with vm_state deleted and task_state None. 
Apr 18 16:29:30 user nova-compute[70975]: DEBUG nova.compute.manager [req-e4b933e4-6ed7-4a41-b072-4c29546c668f req-d558802b-02aa-48b9-ac91-c603c6479734 service nova] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Received event network-vif-unplugged-b66d41ab-873c-4826-a3f8-d4f4276fff10 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:29:30 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-e4b933e4-6ed7-4a41-b072-4c29546c668f req-d558802b-02aa-48b9-ac91-c603c6479734 service nova] Acquiring lock "66df9389-d007-4737-8bb1-55bcb5f227ff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:29:30 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-e4b933e4-6ed7-4a41-b072-4c29546c668f req-d558802b-02aa-48b9-ac91-c603c6479734 service nova] Lock "66df9389-d007-4737-8bb1-55bcb5f227ff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:29:30 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-e4b933e4-6ed7-4a41-b072-4c29546c668f req-d558802b-02aa-48b9-ac91-c603c6479734 service nova] Lock "66df9389-d007-4737-8bb1-55bcb5f227ff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:29:30 user nova-compute[70975]: DEBUG nova.compute.manager [req-e4b933e4-6ed7-4a41-b072-4c29546c668f req-d558802b-02aa-48b9-ac91-c603c6479734 service nova] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] No waiting events found dispatching network-vif-unplugged-b66d41ab-873c-4826-a3f8-d4f4276fff10 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:29:30 user nova-compute[70975]: WARNING nova.compute.manager [req-e4b933e4-6ed7-4a41-b072-4c29546c668f req-d558802b-02aa-48b9-ac91-c603c6479734 service nova] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Received unexpected event network-vif-unplugged-b66d41ab-873c-4826-a3f8-d4f4276fff10 for instance with vm_state deleted and task_state None. 
Apr 18 16:29:30 user nova-compute[70975]: DEBUG nova.compute.manager [req-e4b933e4-6ed7-4a41-b072-4c29546c668f req-d558802b-02aa-48b9-ac91-c603c6479734 service nova] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Received event network-vif-plugged-b66d41ab-873c-4826-a3f8-d4f4276fff10 {{(pid=70975) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 18 16:29:30 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-e4b933e4-6ed7-4a41-b072-4c29546c668f req-d558802b-02aa-48b9-ac91-c603c6479734 service nova] Acquiring lock "66df9389-d007-4737-8bb1-55bcb5f227ff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:29:30 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-e4b933e4-6ed7-4a41-b072-4c29546c668f req-d558802b-02aa-48b9-ac91-c603c6479734 service nova] Lock "66df9389-d007-4737-8bb1-55bcb5f227ff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:29:30 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [req-e4b933e4-6ed7-4a41-b072-4c29546c668f req-d558802b-02aa-48b9-ac91-c603c6479734 service nova] Lock "66df9389-d007-4737-8bb1-55bcb5f227ff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:29:30 user nova-compute[70975]: DEBUG nova.compute.manager [req-e4b933e4-6ed7-4a41-b072-4c29546c668f req-d558802b-02aa-48b9-ac91-c603c6479734 service nova] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] No waiting events found dispatching network-vif-plugged-b66d41ab-873c-4826-a3f8-d4f4276fff10 {{(pid=70975) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 18 16:29:30 user nova-compute[70975]: WARNING nova.compute.manager [req-e4b933e4-6ed7-4a41-b072-4c29546c668f req-d558802b-02aa-48b9-ac91-c603c6479734 service nova] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Received unexpected event network-vif-plugged-b66d41ab-873c-4826-a3f8-d4f4276fff10 for instance with vm_state deleted and task_state None. 
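Editor's note on the block of warnings above: Neutron/OVN keeps delivering network-vif-plugged and network-vif-unplugged notifications for port b66d41ab-873c-4826-a3f8-d4f4276fff10 after the instance has already reached vm_state deleted, so pop_instance_event finds nothing registered and each event is logged as unexpected and dropped. These notifications reach nova through the os-server-external-events API. A rough sketch of that call, assuming a standard /v2.1 compute endpoint and a valid Keystone token (both placeholders here), with the event fields taken from the log:

    import requests

    NOVA_ENDPOINT = "http://controller/compute/v2.1"   # assumed endpoint URL
    TOKEN = "<keystone-token>"                          # assumed credential

    payload = {
        "events": [
            {
                "name": "network-vif-plugged",
                "server_uuid": "66df9389-d007-4737-8bb1-55bcb5f227ff",
                "tag": "b66d41ab-873c-4826-a3f8-d4f4276fff10",  # the Neutron port id
                "status": "completed",
            }
        ]
    }

    resp = requests.post(
        NOVA_ENDPOINT + "/os-server-external-events",
        json=payload,
        headers={"X-Auth-Token": TOKEN},
    )
    print(resp.status_code, resp.text)

Nova routes each accepted event to the compute host that owns the server; the "No waiting events found dispatching ..." lines above are what that routing looks like when nothing on the compute side is waiting for the event.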
Apr 18 16:29:34 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:29:34 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:29:39 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:29:44 user nova-compute[70975]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70975) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 18 16:29:44 user nova-compute[70975]: INFO nova.compute.manager [-] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] VM Stopped (Lifecycle Event) Apr 18 16:29:44 user nova-compute[70975]: DEBUG nova.compute.manager [None req-6ef839ed-cf0d-48d5-be2e-d1d3637b22ba None None] [instance: 66df9389-d007-4737-8bb1-55bcb5f227ff] Checking state {{(pid=70975) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 18 16:29:44 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:29:49 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:29:54 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:29:54 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:29:54 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=70975) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 18 16:29:54 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=70975) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 18 16:29:54 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70975) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 18 16:29:54 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:29:59 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:30:04 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:30:04 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:30:04 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=70975) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 18 16:30:04 user nova-compute[70975]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=70975) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 18 16:30:04 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70975) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 18 16:30:04 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:30:05 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:30:07 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:30:07 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=70975) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 18 16:30:09 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:30:09 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:30:09 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:30:10 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:30:10 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Starting heal instance info cache {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 18 16:30:10 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Rebuilding the list of instances to heal {{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 18 16:30:10 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Didn't find any instances for network info cache update. 
{{(pid=70975) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 18 16:30:10 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:30:10 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Cleaning up deleted instances with incomplete migration {{(pid=70975) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 18 16:30:11 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:30:12 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:30:12 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:30:13 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager.update_available_resource {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:30:13 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:30:13 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:30:13 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:30:13 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Auditing locally available compute resources for user (node: user) {{(pid=70975) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 18 16:30:13 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
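Editor's note on the run of periodic tasks above: everything from _poll_rebooting_instances down to update_available_resource is the oslo.service periodic-task machinery walking the ComputeManager's decorated methods on their configured spacing (reclaim_instance_interval <= 0, for example, makes _reclaim_queued_deletes log "skipping" and return). A minimal sketch of the same pattern, assuming oslo.service and oslo.config are installed; the class and task bodies are illustrative stand-ins, not nova code.

    import time

    from oslo_config import cfg
    from oslo_service import periodic_task

    CONF = cfg.CONF


    class DemoManager(periodic_task.PeriodicTasks):
        # Stand-in for the ComputeManager periodic-task pattern seen in the log.

        def __init__(self):
            super().__init__(CONF)

        @periodic_task.periodic_task(spacing=60)
        def _heal_instance_info_cache(self, context):
            print("healing instance info cache")

        @periodic_task.periodic_task(spacing=60)
        def update_available_resource(self, context):
            print("auditing locally available compute resources")


    if __name__ == "__main__":
        mgr = DemoManager()
        while True:
            # run_periodic_tasks() runs whatever is due and returns how long
            # the caller may idle before the next task; oslo.service normally
            # drives this from its own service loop.
            idle = mgr.run_periodic_tasks(context=None)
            time.sleep(idle if idle else 1)

In nova these tasks all execute inside nova-compute's service loop under a single synthesized request context, which is why every periodic-task line above carries the same req-5374b104-62e1-472c-925f-f4defab71e2c request id.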
Apr 18 16:30:13 user nova-compute[70975]: WARNING nova.virt.libvirt.driver [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 18 16:30:13 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Hypervisor/Node resource view: name=user free_ram=9228MB free_disk=26.595657348632812GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": 
null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", 
"address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70975) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 18 16:30:13 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 18 16:30:13 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 18 16:30:14 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 18 16:30:14 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=70975) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 18 16:30:14 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Refreshing inventories for resource provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 18 16:30:14 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Updating ProviderTree inventory for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 18 16:30:14 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None 
req-5374b104-62e1-472c-925f-f4defab71e2c None None] Updating inventory in ProviderTree for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 with inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 18 16:30:14 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Refreshing aggregate associations for resource provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9, aggregates: None {{(pid=70975) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 18 16:30:14 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Refreshing trait associations for resource provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9, traits: COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE41,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_RESCUE_BFV,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_IDE,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE42 {{(pid=70975) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 18 16:30:14 user nova-compute[70975]: DEBUG nova.compute.provider_tree [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Inventory has not changed in ProviderTree for provider: 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 {{(pid=70975) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 18 16:30:14 user nova-compute[70975]: DEBUG nova.scheduler.client.report [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Inventory has not changed for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70975) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 18 16:30:14 user nova-compute[70975]: DEBUG nova.compute.resource_tracker [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] 
Compute_service record updated for user:user {{(pid=70975) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 18 16:30:14 user nova-compute[70975]: DEBUG oslo_concurrency.lockutils [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.437s {{(pid=70975) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 18 16:30:14 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:30:17 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:30:19 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:30:20 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:30:20 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Cleaning up deleted instances {{(pid=70975) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 18 16:30:20 user nova-compute[70975]: DEBUG nova.compute.manager [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] There are 0 instances to clean {{(pid=70975) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 18 16:30:21 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:30:21 user nova-compute[70975]: DEBUG oslo_service.periodic_task [None req-5374b104-62e1-472c-925f-f4defab71e2c None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=70975) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 18 16:30:24 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:30:29 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:30:29 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:30:34 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 18 16:30:39 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:30:44 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70975) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:30:49 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 18 16:30:54 user nova-compute[70975]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=70975) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}}
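As a worked example of the inventory the resource tracker reports above, placement derives effective capacity per resource class as (total - reserved) * allocation_ratio, so the payload in these entries yields 48 schedulable VCPUs, 15511 MB of RAM and 40 GB of disk. A few lines of Python reproduce that arithmetic (the dict is copied from the log, trimmed to the relevant keys):

# Inventory as reported for provider 161a05c2-8402-4a6a-9ad9-6fdf826a94d9,
# trimmed to the fields used in the capacity calculation.
inventory = {
    'VCPU':      {'total': 12,    'reserved': 0,   'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 16023, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 40,    'reserved': 0,   'allocation_ratio': 1.0},
}

for rc, inv in inventory.items():
    capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
    print(f"{rc}: {capacity:g}")
# -> VCPU: 48, MEMORY_MB: 15511, DISK_GB: 40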
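The 'Acquiring lock "compute_resources" ...', 'Lock ... acquired ... waited', and 'Lock ... "released" ... held' triplets above come from oslo_concurrency's lock wrapper, which logs how long a caller waited for and held a named in-process lock. Below is a minimal sketch of that pattern, using an illustrative class and method (not nova's resource tracker):

import logging

from oslo_concurrency import lockutils

logging.basicConfig(level=logging.DEBUG)  # surface lockutils' acquire/release DEBUG lines


class DemoResourceTracker:
    @lockutils.synchronized('compute_resources')
    def update_available_resource(self):
        # Work done under the lock; the wrapper logs the waited/held
        # durations in the same format as the journal entries above.
        return 'updated'


if __name__ == '__main__':
    DemoResourceTracker().update_available_resource()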