Jul 27 09:25:56 user nova-compute[70374]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
Jul 27 09:25:59 user nova-compute[70374]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=70374) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
Jul 27 09:25:59 user nova-compute[70374]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=70374) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
Jul 27 09:25:59 user nova-compute[70374]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=70374) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
Jul 27 09:25:59 user nova-compute[70374]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.021s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
Jul 27 09:25:59 user nova-compute[70374]: INFO nova.virt.driver [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] Loading compute driver 'libvirt.LibvirtDriver'
Jul 27 09:25:59 user nova-compute[70374]: INFO nova.compute.provider_config [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] Acquiring lock "singleton_lock" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] Acquired lock "singleton_lock" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] Releasing lock "singleton_lock" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] Full set of CONF: {{(pid=70374) _wait_for_exit_or_signal /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:362}}
Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] ******************************************************************************** {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2589}}
Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] Configuration options gathered from: {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2590}}
Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] command line args: ['--config-file', '/etc/nova/nova-cpu.conf'] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2591}}
Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] config files: ['/etc/nova/nova-cpu.conf'] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2592}}
Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] ================================================================================ {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2594}}
Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] allow_resize_to_same_host = True {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] arq_binding_timeout = 300 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] backdoor_port = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] backdoor_socket = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330
None None] block_device_allocate_retries = 300 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] block_device_allocate_retries_interval = 5 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cert = self.pem {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] compute_driver = libvirt.LibvirtDriver {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] compute_monitors = [] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] config_dir = [] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] config_drive_format = iso9660 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] config_file = ['/etc/nova/nova-cpu.conf'] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] config_source = [] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] console_host = user {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] control_exchange = nova {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cpu_allocation_ratio = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] daemon = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] debug = True {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] default_access_ip_network_name = None {{(pid=70374) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] default_availability_zone = nova {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] default_ephemeral_format = ext4 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] default_schedule_zone = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] disk_allocation_ratio = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] enable_new_services = True {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] enabled_apis = ['osapi_compute'] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] enabled_ssl_apis = [] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] flat_injected = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] force_config_drive = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] force_raw_images = True {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] graceful_shutdown_timeout = 5 {{(pid=70374) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] heal_instance_info_cache_interval = 60 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] host = user {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] initial_cpu_allocation_ratio = 4.0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] initial_disk_allocation_ratio = 1.0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] initial_ram_allocation_ratio = 1.0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] instance_build_timeout = 0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] instance_delete_interval = 300 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] instance_format = [instance: %(uuid)s] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] instance_name_template = instance-%08x {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] instance_usage_audit = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] instance_usage_audit_period = month {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] 
instances_path = /opt/stack/data/nova/instances {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] internal_service_availability_zone = internal {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] key = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] live_migration_retry_count = 30 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] log_config_append = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] log_dir = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] log_file = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] log_options = True {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] log_rotate_interval = 1 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] log_rotate_interval_type = days {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] log_rotation_type = none {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=70374) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] long_rpc_timeout = 1800 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] max_concurrent_builds = 10 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] max_concurrent_live_migrations = 1 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] max_concurrent_snapshots = 5 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] max_local_block_devices = 3 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] max_logfile_count = 30 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] max_logfile_size_mb = 200 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] maximum_instance_delete_attempts = 5 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] metadata_listen = 0.0.0.0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] metadata_listen_port = 8775 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user 
nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] metadata_workers = 3 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] migrate_max_retries = -1 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] mkisofs_cmd = genisoimage {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] my_block_storage_ip = 10.0.0.210 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] my_ip = 10.0.0.210 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] network_allocate_retries = 0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] osapi_compute_listen = 0.0.0.0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] osapi_compute_listen_port = 8774 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] osapi_compute_unique_server_name_scope = {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] osapi_compute_workers = 3 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] password_length = 12 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] periodic_enable = True {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] periodic_fuzzy_delay = 60 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: 
DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] pointer_model = ps2mouse {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] preallocate_images = none {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] publish_errors = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] pybasedir = /opt/stack/nova {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] ram_allocation_ratio = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] rate_limit_burst = 0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] rate_limit_except_level = CRITICAL {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] rate_limit_interval = 0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] reboot_timeout = 0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] reclaim_instance_interval = 0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] record = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] reimage_timeout_per_gb = 60 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] report_interval = 10 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] rescue_timeout = 0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] reserved_host_cpus = 
0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] reserved_host_disk_mb = 0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] reserved_host_memory_mb = 512 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] reserved_huge_pages = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] resize_confirm_window = 0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] resize_fs_using_block_device = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] resume_guests_state_on_host_boot = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] rpc_response_timeout = 60 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] run_external_periodic_tasks = True {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] running_deleted_instance_action = reap {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] running_deleted_instance_poll_interval = 1800 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] running_deleted_instance_timeout = 0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] scheduler_instance_sync_interval = 120 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None 
None] service_down_time = 60 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] servicegroup_driver = db {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] shelved_offload_time = 0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] shelved_poll_interval = 3600 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] shutdown_timeout = 0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] source_is_ipv6 = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] ssl_only = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] state_path = /opt/stack/data/nova {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] sync_power_state_interval = 600 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] sync_power_state_pool_size = 1000 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] syslog_log_facility = LOG_USER {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] tempdir = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] timeout_nbd = 10 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] transport_url = **** {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] update_resources_interval = 0 {{(pid=70374) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] use_cow_images = True {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] use_eventlog = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] use_journal = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] use_json = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] use_rootwrap_daemon = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] use_stderr = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] use_syslog = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vcpu_pin_set = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vif_plugging_is_fatal = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vif_plugging_timeout = 0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] virt_mkfs = [] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] volume_usage_poll_interval = 0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] watch_log_file = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] web = /usr/share/spice-html5 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG 
oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_concurrency.disable_process_locking = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_concurrency.lock_path = /opt/stack/data/nova {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_messaging_metrics.metrics_process_name = {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api.auth_strategy = keystone {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api.compute_link_prefix = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api.dhcp_domain = novalocal {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api.enable_instance_password = True {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api.glance_link_prefix = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service 
[None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api.instance_list_cells_batch_strategy = distributed {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api.instance_list_per_project_cells = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api.list_records_by_skipping_down_cells = True {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api.local_metadata_per_cell = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api.max_limit = 1000 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api.metadata_cache_expiration = 15 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api.neutron_default_tenant_id = default {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api.use_forwarded_for = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api.use_neutron_default_nets = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api.vendordata_dynamic_failure_fatal = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api.vendordata_dynamic_ssl_certfile = {{(pid=70374) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api.vendordata_dynamic_targets = [] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api.vendordata_jsonfile_path = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api.vendordata_providers = ['StaticJSON'] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cache.backend = dogpile.cache.memcached {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cache.backend_argument = **** {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cache.config_prefix = cache.oslo {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cache.dead_timeout = 60.0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cache.debug_cache_backend = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cache.enable_retry_client = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cache.enable_socket_keepalive = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cache.enabled = True {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cache.expiration_time = 600 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cache.hashclient_retry_attempts = 2 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cache.hashclient_retry_delay = 1.0 
{{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cache.memcache_dead_retry = 300 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cache.memcache_password = {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cache.memcache_pool_maxsize = 10 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cache.memcache_pool_unused_timeout = 60 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cache.memcache_sasl_enabled = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cache.memcache_servers = ['localhost:11211'] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cache.memcache_socket_timeout = 1.0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cache.memcache_username = {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cache.proxies = [] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cache.retry_attempts = 2 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cache.retry_delay = 0.0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] 
cache.socket_keepalive_count = 1 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cache.socket_keepalive_idle = 1 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cache.socket_keepalive_interval = 1 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cache.tls_allowed_ciphers = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cache.tls_cafile = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cache.tls_certfile = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cache.tls_enabled = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cache.tls_keyfile = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cinder.auth_section = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cinder.auth_type = password {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cinder.cafile = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cinder.catalog_info = volumev3::publicURL {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cinder.certfile = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cinder.collect_timing = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cinder.cross_az_attach = True 
{{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cinder.debug = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cinder.endpoint_template = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cinder.http_retries = 3 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cinder.insecure = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cinder.keyfile = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cinder.os_region_name = RegionOne {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cinder.split_loggers = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cinder.timeout = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] compute.cpu_dedicated_set = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] compute.cpu_shared_set = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] compute.image_type_exclude_list = [] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] compute.live_migration_wait_for_vif_plug = True {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] compute.max_concurrent_disk_ops = 
0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] compute.max_disk_devices_to_attach = -1 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] compute.resource_provider_association_refresh = 300 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] compute.shutdown_retry_interval = 10 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] conductor.workers = 3 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] console.allowed_origins = [] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] console.ssl_ciphers = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] console.ssl_minimum_version = default {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] consoleauth.token_ttl = 600 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cyborg.cafile = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cyborg.certfile = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG 
oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cyborg.collect_timing = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cyborg.connect_retries = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cyborg.connect_retry_delay = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cyborg.endpoint_override = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cyborg.insecure = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cyborg.keyfile = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cyborg.max_version = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cyborg.min_version = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cyborg.region_name = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cyborg.service_name = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cyborg.service_type = accelerator {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cyborg.split_loggers = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cyborg.status_code_retries = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cyborg.status_code_retry_delay = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None 
req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cyborg.timeout = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] cyborg.version = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] database.backend = sqlalchemy {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] database.connection = **** {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] database.connection_debug = 0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] database.connection_parameters = {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] database.connection_recycle_time = 3600 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] database.connection_trace = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] database.db_inc_retry_interval = True {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] database.db_max_retries = 20 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] database.db_max_retry_interval = 10 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] database.db_retry_interval = 1 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] database.max_overflow = 50 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service 
[None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] database.max_pool_size = 5 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] database.max_retries = 10 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] database.mysql_enable_ndb = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] database.mysql_sql_mode = TRADITIONAL {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] database.mysql_wsrep_sync_wait = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] database.pool_timeout = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] database.retry_interval = 10 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] database.slave_connection = **** {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] database.sqlite_synchronous = True {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api_database.backend = sqlalchemy {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api_database.connection = **** {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api_database.connection_debug = 0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api_database.connection_parameters = {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api_database.connection_recycle_time = 3600 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: 
DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api_database.connection_trace = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api_database.db_inc_retry_interval = True {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api_database.db_max_retries = 20 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api_database.db_max_retry_interval = 10 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api_database.db_retry_interval = 1 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api_database.max_overflow = 50 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api_database.max_pool_size = 5 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api_database.max_retries = 10 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api_database.mysql_enable_ndb = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api_database.mysql_wsrep_sync_wait = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api_database.pool_timeout = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api_database.retry_interval = 10 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api_database.slave_connection = **** {{(pid=70374) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] api_database.sqlite_synchronous = True {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] devices.enabled_mdev_types = [] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] ephemeral_storage_encryption.enabled = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] ephemeral_storage_encryption.key_size = 512 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] glance.api_servers = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] glance.cafile = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] glance.certfile = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] glance.collect_timing = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] glance.connect_retries = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] glance.connect_retry_delay = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] glance.debug = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] glance.default_trusted_certificate_ids = [] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] 
glance.enable_certificate_validation = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] glance.enable_rbd_download = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] glance.endpoint_override = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] glance.insecure = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] glance.keyfile = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] glance.max_version = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] glance.min_version = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] glance.num_retries = 3 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] glance.rbd_ceph_conf = {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] glance.rbd_connect_timeout = 5 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] glance.rbd_pool = {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] glance.rbd_user = {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] glance.region_name = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] glance.service_name = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] glance.service_type = image {{(pid=70374) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] glance.split_loggers = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] glance.status_code_retries = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] glance.status_code_retry_delay = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] glance.timeout = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] glance.verify_glance_signatures = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] glance.version = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] guestfs.debug = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] hyperv.config_drive_cdrom = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] hyperv.config_drive_inject_password = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] hyperv.enable_instance_metrics_collection = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] hyperv.enable_remotefx = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] 
hyperv.instances_path_share = {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] hyperv.iscsi_initiator_list = [] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] hyperv.limit_cpu_features = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] hyperv.power_state_check_timeframe = 60 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] hyperv.power_state_event_polling_interval = 2 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] hyperv.use_multipath_io = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] hyperv.volume_attach_retry_count = 10 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] hyperv.volume_attach_retry_interval = 5 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] hyperv.vswitch_name = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] mks.enabled = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service 
[None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] image_cache.manager_interval = 2400 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] image_cache.precache_concurrency = 1 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] image_cache.remove_unused_base_images = True {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] image_cache.subdirectory_name = _base {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] ironic.api_max_retries = 60 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] ironic.api_retry_interval = 2 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] ironic.auth_section = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] ironic.auth_type = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] ironic.cafile = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] ironic.certfile = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] ironic.collect_timing = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 
27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] ironic.connect_retries = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] ironic.connect_retry_delay = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] ironic.endpoint_override = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] ironic.insecure = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] ironic.keyfile = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] ironic.max_version = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] ironic.min_version = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] ironic.partition_key = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] ironic.peer_list = [] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] ironic.region_name = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] ironic.serial_console_state_timeout = 10 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] ironic.service_name = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] ironic.service_type = baremetal {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] ironic.split_loggers = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG 
oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] ironic.status_code_retries = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] ironic.status_code_retry_delay = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] ironic.timeout = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] ironic.version = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] key_manager.fixed_key = **** {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] barbican.barbican_api_version = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] barbican.barbican_endpoint = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] barbican.barbican_endpoint_type = public {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] barbican.barbican_region_name = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] barbican.cafile = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] barbican.certfile = None {{(pid=70374) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] barbican.collect_timing = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] barbican.insecure = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] barbican.keyfile = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] barbican.number_of_retries = 60 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] barbican.retry_delay = 1 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] barbican.send_service_user_token = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] barbican.split_loggers = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] barbican.timeout = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] barbican.verify_ssl = True {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] barbican.verify_ssl_path = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] barbican_service_user.auth_section = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] barbican_service_user.auth_type = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] barbican_service_user.cafile = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] barbican_service_user.certfile = None {{(pid=70374) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] barbican_service_user.collect_timing = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] barbican_service_user.insecure = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] barbican_service_user.keyfile = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] barbican_service_user.split_loggers = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] barbican_service_user.timeout = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vault.approle_role_id = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vault.approle_secret_id = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vault.cafile = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vault.certfile = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vault.collect_timing = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vault.insecure = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vault.keyfile = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vault.kv_mountpoint = secret {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vault.kv_version = 2 {{(pid=70374) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vault.namespace = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vault.root_token_id = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vault.split_loggers = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vault.ssl_ca_crt_file = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vault.timeout = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vault.use_ssl = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] keystone.cafile = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] keystone.certfile = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] keystone.collect_timing = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] keystone.connect_retries = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] keystone.connect_retry_delay = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] keystone.endpoint_override = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] keystone.insecure = False {{(pid=70374) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] keystone.keyfile = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] keystone.max_version = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] keystone.min_version = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] keystone.region_name = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] keystone.service_name = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] keystone.service_type = identity {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] keystone.split_loggers = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] keystone.status_code_retries = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] keystone.status_code_retry_delay = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] keystone.timeout = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] keystone.version = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.connection_uri = {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.cpu_mode = custom {{(pid=70374) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.cpu_model_extra_flags = [] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: WARNING oslo_config.cfg [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] Deprecated: Option "cpu_model" from group "libvirt" is deprecated. Use option "cpu_models" from group "libvirt". Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.cpu_models = ['Nehalem'] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.cpu_power_governor_high = performance {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.cpu_power_governor_low = powersave {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.cpu_power_management = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.device_detach_attempts = 8 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.device_detach_timeout = 20 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.disk_cachemodes = [] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.disk_prefix = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.enabled_perf_events = [] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.file_backed_memory = 0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.gid_maps = [] 
{{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.hw_disk_discard = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.hw_machine_type = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.images_rbd_ceph_conf = {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.images_rbd_glance_store_name = {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.images_rbd_pool = rbd {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.images_type = default {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.images_volume_group = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.inject_key = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.inject_partition = -2 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.inject_password = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.iscsi_iface = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] 
libvirt.iser_use_multipath = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.live_migration_bandwidth = 0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.live_migration_completion_timeout = 800 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.live_migration_downtime = 500 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.live_migration_downtime_delay = 75 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.live_migration_downtime_steps = 10 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.live_migration_inbound_addr = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.live_migration_permit_auto_converge = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.live_migration_permit_post_copy = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.live_migration_scheme = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.live_migration_timeout_action = abort {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.live_migration_tunnelled = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: WARNING oslo_config.cfg [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal ( Jul 27 09:25:59 user nova-compute[70374]: live_migration_uri is deprecated for removal in favor of two other options that Jul 27 09:25:59 user nova-compute[70374]: allow to change live migration scheme and target URI: ``live_migration_scheme`` Jul 27 09:25:59 user nova-compute[70374]: and 
``live_migration_inbound_addr`` respectively. Jul 27 09:25:59 user nova-compute[70374]: ). Its value may be silently ignored in the future. Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.live_migration_uri = qemu+ssh://stack@%s/system {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.live_migration_with_native_tls = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.max_queues = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.mem_stats_period_seconds = 10 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.nfs_mount_options = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.nfs_mount_point_base = /opt/stack/data/nova/mnt {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.num_aoe_discover_tries = 3 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.num_iser_scan_tries = 5 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.num_memory_encrypted_guests = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.num_nvme_discover_tries = 5 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.num_pcie_ports = 0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.num_volume_scan_tries = 5 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.pmem_namespaces = [] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user 
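The two WARNING records above ("cpu_model" superseded by "cpu_models", and "live_migration_uri" deprecated for removal) are what oslo.config emits when a config file still sets an option that carries deprecation metadata. The sketch below is illustrative only, not nova's actual option definitions; it shows the two kinds of metadata that produce exactly these warning formats.

    # Illustrative sketch: oslo.config deprecation metadata behind the
    # "Deprecated: Option ..." warnings seen in the log above.
    from oslo_config import cfg

    CONF = cfg.ConfigOpts()
    CONF.register_opts(
        [
            # Produces the '... is deprecated for removal (<reason>). Its value
            # may be silently ignored in the future.' style warning.
            cfg.StrOpt(
                'live_migration_uri',
                deprecated_for_removal=True,
                deprecated_reason='superseded by live_migration_scheme and '
                                  'live_migration_inbound_addr',
            ),
            # Produces the 'Use option "cpu_models" from group "libvirt".'
            # style warning when the old name is still set.
            cfg.ListOpt(
                'cpu_models',
                deprecated_opts=[cfg.DeprecatedOpt('cpu_model', group='libvirt')],
            ),
        ],
        group='libvirt',
    )
    # Parsing a config file that still sets the old names is what triggers the
    # warnings; the path matches the file named earlier in this log.
    CONF(['--config-file', '/etc/nova/nova-cpu.conf'], project='nova')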
nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.quobyte_client_cfg = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.quobyte_mount_point_base = /opt/stack/data/nova/mnt {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.rbd_connect_timeout = 5 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.rbd_secret_uuid = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.rbd_user = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.realtime_scheduler_priority = 1 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.remote_filesystem_transport = ssh {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.rescue_image_id = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.rescue_kernel_id = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.rescue_ramdisk_id = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.rng_dev_path = /dev/urandom {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.rx_queue_size = None {{(pid=70374) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.smbfs_mount_options = {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.smbfs_mount_point_base = /opt/stack/data/nova/mnt {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.snapshot_compression = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.snapshot_image_format = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.sparse_logical_volumes = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.swtpm_enabled = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.swtpm_group = tss {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.swtpm_user = tss {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.sysinfo_serial = unique {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.tb_cache_size = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.tx_queue_size = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.uid_maps = [] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] 
libvirt.use_virtio_for_bridges = True {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.virt_type = kvm {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.volume_clear = zero {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.volume_clear_size = 0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.volume_use_multipath = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.vzstorage_cache_path = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.vzstorage_mount_group = qemu {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.vzstorage_mount_opts = [] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/nova/mnt {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.vzstorage_mount_user = stack {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] neutron.auth_section = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user 
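Every "group.option = value" DEBUG record in this dump is produced by oslo.config's ConfigOpts.log_opt_values(), the log_opt_values frame cited at the end of each record. A minimal standalone sketch of that mechanism follows; the two registered options and their defaults are illustrative stand-ins, not nova's real registrations.

    # Sketch of the mechanism behind the "group.option = value" DEBUG lines:
    # ConfigOpts.log_opt_values() walks every registered option and logs it.
    import logging
    from oslo_config import cfg

    logging.basicConfig(level=logging.DEBUG)
    LOG = logging.getLogger(__name__)

    CONF = cfg.ConfigOpts()
    CONF.register_opts(
        [cfg.StrOpt('virt_type', default='kvm'),            # illustrative defaults
         cfg.BoolOpt('volume_use_multipath', default=False)],
        group='libvirt',
    )
    CONF([], project='nova')                  # defaults only; no config file given
    CONF.log_opt_values(LOG, logging.DEBUG)   # -> libvirt.virt_type = kvm, ...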
nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] neutron.auth_type = password {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] neutron.cafile = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] neutron.certfile = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] neutron.collect_timing = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] neutron.connect_retries = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] neutron.connect_retry_delay = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] neutron.default_floating_pool = public {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] neutron.endpoint_override = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] neutron.extension_sync_interval = 600 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] neutron.http_retries = 3 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] neutron.insecure = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] neutron.keyfile = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] neutron.max_version = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] neutron.metadata_proxy_shared_secret = **** {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: 
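neutron.metadata_proxy_shared_secret is printed as **** (as placement.password is further down) because log_opt_values() masks any option registered with secret=True; the real value remains available to the code. A small illustrative sketch, with a made-up value and a registration that is not nova's actual code:

    # Options registered with secret=True are logged as '****' by
    # log_opt_values(); reading them in code still returns the real value.
    import logging
    from oslo_config import cfg

    logging.basicConfig(level=logging.DEBUG)
    LOG = logging.getLogger(__name__)

    CONF = cfg.ConfigOpts()
    CONF.register_opts(
        [cfg.StrOpt('metadata_proxy_shared_secret', secret=True, default='s3cr3t')],
        group='neutron',
    )
    CONF([], project='nova')
    CONF.log_opt_values(LOG, logging.DEBUG)   # logs: neutron.metadata_proxy_shared_secret = ****
    assert CONF.neutron.metadata_proxy_shared_secret == 's3cr3t'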
DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] neutron.min_version = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] neutron.ovs_bridge = br-int {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] neutron.physnets = [] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] neutron.region_name = RegionOne {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] neutron.service_metadata_proxy = True {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] neutron.service_name = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] neutron.service_type = network {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] neutron.split_loggers = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] neutron.status_code_retries = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] neutron.status_code_retry_delay = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] neutron.timeout = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] neutron.version = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] notifications.bdms_in_notifications = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user 
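The [neutron] options dumped above (cafile, certfile, keyfile, insecure, timeout, collect_timing, split_loggers, connect_retries, service_type, region_name, valid_interfaces, endpoint_override, ...) are the standard keystoneauth1 auth/session/adapter options. The sketch below shows the usual way such a group is registered and loaded from an oslo.config group; it is a hedged illustration, and nova's actual wiring may differ in detail.

    # Sketch: loading keystoneauth1 auth/session/adapter settings from the
    # [neutron] group of the config file named earlier in this log.
    from keystoneauth1 import loading
    from oslo_config import cfg

    CONF = cfg.ConfigOpts()
    loading.register_auth_conf_options(CONF, 'neutron')      # auth_type, auth_url, ...
    loading.register_session_conf_options(CONF, 'neutron')   # cafile, insecure, timeout, ...
    loading.register_adapter_conf_options(CONF, 'neutron')   # service_type, region_name, valid_interfaces, ...
    CONF(['--config-file', '/etc/nova/nova-cpu.conf'], project='nova')

    auth = loading.load_auth_from_conf_options(CONF, 'neutron')
    session = loading.load_session_from_conf_options(CONF, 'neutron', auth=auth)
    adapter = loading.load_adapter_from_conf_options(CONF, 'neutron', session=session)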
nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] notifications.default_level = INFO {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] notifications.notification_format = unversioned {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] notifications.notify_on_state_change = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] pci.alias = [] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] pci.device_spec = [] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] pci.report_in_placement = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] placement.auth_section = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] placement.auth_type = password {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] placement.auth_url = http://10.0.0.210/identity {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] placement.cafile = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] placement.certfile = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] placement.collect_timing = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] placement.connect_retries = None {{(pid=70374) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] placement.connect_retry_delay = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] placement.default_domain_id = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] placement.default_domain_name = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] placement.domain_id = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] placement.domain_name = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] placement.endpoint_override = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] placement.insecure = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] placement.keyfile = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] placement.max_version = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] placement.min_version = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] placement.password = **** {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] placement.project_domain_id = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] placement.project_domain_name = Default {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] placement.project_id = None {{(pid=70374) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] placement.project_name = service {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] placement.region_name = RegionOne {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] placement.service_name = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] placement.service_type = placement {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] placement.split_loggers = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] placement.status_code_retries = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] placement.status_code_retry_delay = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] placement.system_scope = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] placement.timeout = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] placement.trust_id = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] placement.user_domain_id = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] placement.user_domain_name = Default {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] placement.user_id = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] placement.username = placement {{(pid=70374) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] placement.version = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] quota.cores = 20 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] quota.count_usage_from_placement = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] quota.injected_file_content_bytes = 10240 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] quota.injected_file_path_length = 255 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] quota.injected_files = 5 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] quota.instances = 10 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] quota.key_pairs = 100 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] quota.metadata_items = 128 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] quota.ram = 51200 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] quota.recheck_quota = True {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] quota.server_group_members = 10 {{(pid=70374) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] quota.server_groups = 10 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] rdp.enabled = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] scheduler.image_metadata_prefilter = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] scheduler.max_attempts = 3 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] scheduler.max_placement_results = 1000 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] scheduler.query_placement_for_availability_zone = True {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] scheduler.query_placement_for_image_type_support = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 
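quota.driver above is a dotted class path (nova.quota.DbQuotaDriver). Config values of this shape are typically resolved to a class at runtime; a minimal, generic sketch using oslo.utils is shown below. How nova actually instantiates its quota driver is not visible in this log, so the sketch stops at importing the class.

    # Generic sketch: resolving a dotted-path config value to a class.
    from oslo_utils import importutils

    driver_path = 'nova.quota.DbQuotaDriver'            # value taken from the dump above
    driver_cls = importutils.import_class(driver_path)  # requires nova to be importable
    print(driver_cls)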
09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] scheduler.workers = 3 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] filter_scheduler.aggregate_image_properties_isolation_separator = . {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] filter_scheduler.host_subset_size = 1 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] filter_scheduler.image_properties_default_architecture = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None 
req-ac66df63-fac3-4a0b-9c91-837479754330 None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] filter_scheduler.isolated_hosts = [] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] filter_scheduler.isolated_images = [] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] filter_scheduler.max_instances_per_host = 50 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] filter_scheduler.pci_in_placement = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] filter_scheduler.track_instance_changes = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None 
req-ac66df63-fac3-4a0b-9c91-837479754330 None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] metrics.required = True {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] metrics.weight_multiplier = 1.0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] metrics.weight_of_unavailable = -10000.0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] metrics.weight_setting = [] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] serial_console.enabled = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] serial_console.port_range = 10000:20000 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] serial_console.serialproxy_port = 6083 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] service_user.auth_section = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] service_user.auth_type = password {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] service_user.cafile = None {{(pid=70374) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] service_user.certfile = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] service_user.collect_timing = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] service_user.insecure = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] service_user.keyfile = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] service_user.send_service_user_token = True {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] service_user.split_loggers = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] service_user.timeout = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] spice.agent_enabled = True {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] spice.enabled = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] spice.html5proxy_base_url = http://10.0.0.210:6081/spice_auto.html {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] spice.html5proxy_host = 0.0.0.0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] spice.html5proxy_port = 6082 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] spice.image_compression = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] spice.jpeg_compression = None 
{{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] spice.playback_compression = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] spice.server_listen = 127.0.0.1 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] spice.streaming_mode = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] spice.zlib_compression = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] upgrade_levels.baseapi = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] upgrade_levels.cert = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] upgrade_levels.compute = auto {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] upgrade_levels.conductor = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] upgrade_levels.scheduler = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vendordata_dynamic_auth.auth_section = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vendordata_dynamic_auth.auth_type = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vendordata_dynamic_auth.cafile = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] 
vendordata_dynamic_auth.certfile = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vendordata_dynamic_auth.collect_timing = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vendordata_dynamic_auth.insecure = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vendordata_dynamic_auth.keyfile = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vendordata_dynamic_auth.split_loggers = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vendordata_dynamic_auth.timeout = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vmware.api_retry_count = 10 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vmware.ca_file = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vmware.cache_prefix = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vmware.cluster_name = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vmware.connection_pool_size = 10 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vmware.console_delay_seconds = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vmware.datastore_regex = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vmware.host_ip = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None 
req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vmware.host_password = **** {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vmware.host_port = 443 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vmware.host_username = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vmware.insecure = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vmware.integration_bridge = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vmware.maximum_objects = 100 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vmware.pbm_default_policy = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vmware.pbm_enabled = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vmware.pbm_wsdl_location = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vmware.serial_port_proxy_uri = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vmware.serial_port_service_uri = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vmware.task_poll_interval = 0.5 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vmware.use_linked_clone = True {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None 
req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vmware.vnc_keymap = en-us {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vmware.vnc_port = 5900 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vmware.vnc_port_total = 10000 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vnc.auth_schemes = ['none'] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vnc.enabled = True {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vnc.novncproxy_base_url = http://10.0.0.210:6080/vnc_lite.html {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vnc.novncproxy_port = 6080 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vnc.server_listen = 0.0.0.0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vnc.server_proxyclient_address = 10.0.0.210 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vnc.vencrypt_ca_certs = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vnc.vencrypt_client_cert = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vnc.vencrypt_client_key = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG 
oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] workarounds.disable_fallback_pcpu_query = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] workarounds.disable_group_policy_check_upcall = True {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] workarounds.disable_rootwrap = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] workarounds.enable_numa_live_migration = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] workarounds.handle_virt_lifecycle_events = True {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] workarounds.libvirt_disable_apic = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] workarounds.never_download_image_if_on_rbd = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service 
[None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] wsgi.client_socket_timeout = 900 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] wsgi.default_pool_size = 1000 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] wsgi.keep_alive = True {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] wsgi.max_header_line = 16384 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] wsgi.secure_proxy_ssl_header = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] wsgi.ssl_ca_file = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] wsgi.ssl_cert_file = None {{(pid=70374) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] wsgi.ssl_key_file = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] wsgi.tcp_keepidle = 600 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] zvm.ca_file = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] zvm.cloud_connector_url = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] zvm.image_tmp_path = /opt/stack/data/nova/images {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] zvm.reachable_timeout = 300 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_policy.enforce_new_defaults = True {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_policy.enforce_scope = True {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_policy.policy_default_rule = default {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_policy.policy_file = policy.yaml {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user 
nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] profiler.connection_string = messaging:// {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] profiler.enabled = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] profiler.es_doc_type = notification {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] profiler.es_scroll_size = 10000 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] profiler.es_scroll_time = 2m {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] profiler.filter_error_trace = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] profiler.hmac_keys = **** {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] profiler.sentinel_service_name = 
mymaster {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] profiler.socket_timeout = 0.1 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] profiler.trace_requests = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] profiler.trace_sqlalchemy = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] profiler_jaeger.process_tags = {} {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] profiler_jaeger.service_name_prefix = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] profiler_otlp.service_name_prefix = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] remote_debug.host = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] remote_debug.port = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG 
oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user 
nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_messaging_rabbit.ssl = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_messaging_rabbit.ssl_version = {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None 
req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_messaging_notifications.retry = -1 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_messaging_notifications.transport_url = **** {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_limit.auth_section = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_limit.auth_type = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_limit.cafile = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_limit.certfile = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_limit.collect_timing = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_limit.connect_retries = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_limit.connect_retry_delay = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_limit.endpoint_id = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_limit.endpoint_override = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_limit.insecure = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 
09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_limit.keyfile = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_limit.max_version = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_limit.min_version = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_limit.region_name = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_limit.service_name = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_limit.service_type = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_limit.split_loggers = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_limit.status_code_retries = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_limit.status_code_retry_delay = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_limit.timeout = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_limit.valid_interfaces = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_limit.version = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_reports.file_event_handler = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_reports.file_event_handler_interval = 1 {{(pid=70374) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] oslo_reports.log_dir = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vif_plug_linux_bridge_privileged.group = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vif_plug_linux_bridge_privileged.thread_pool_size = 12 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vif_plug_linux_bridge_privileged.user = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vif_plug_ovs_privileged.group = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vif_plug_ovs_privileged.helper_command = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vif_plug_ovs_privileged.thread_pool_size = 12 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] vif_plug_ovs_privileged.user = None {{(pid=70374) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] os_vif_linux_bridge.flat_interface = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] os_vif_linux_bridge.vlan_interface = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] os_vif_ovs.isolate_vif = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] os_vif_ovs.ovsdb_interface = native {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG 
oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] os_vif_ovs.per_port_bridge = False {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] os_brick.lock_path = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] privsep_osbrick.capabilities = [21] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] privsep_osbrick.group = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] privsep_osbrick.helper_command = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] privsep_osbrick.thread_pool_size = 12 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] privsep_osbrick.user = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] nova_sys_admin.group = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] nova_sys_admin.helper_command = None {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] nova_sys_admin.thread_pool_size = 12 {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] nova_sys_admin.user = None {{(pid=70374) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG oslo_service.service [None req-ac66df63-fac3-4a0b-9c91-837479754330 None None] ******************************************************************************** {{(pid=70374) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2613}} Jul 27 09:25:59 user nova-compute[70374]: INFO nova.service [-] Starting compute node (version 0.0.0) Jul 27 09:25:59 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Starting native event thread {{(pid=70374) _init_events /opt/stack/nova/nova/virt/libvirt/host.py:492}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Starting green dispatch thread {{(pid=70374) _init_events /opt/stack/nova/nova/virt/libvirt/host.py:498}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Starting connection event dispatch thread {{(pid=70374) initialize /opt/stack/nova/nova/virt/libvirt/host.py:620}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Connecting to libvirt: qemu:///system {{(pid=70374) _get_new_connection /opt/stack/nova/nova/virt/libvirt/host.py:503}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Registering for lifecycle events {{(pid=70374) _get_new_connection /opt/stack/nova/nova/virt/libvirt/host.py:509}} Jul 27 09:25:59 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Registering for connection events: {{(pid=70374) _get_new_connection /opt/stack/nova/nova/virt/libvirt/host.py:530}} Jul 27 09:25:59 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Connection event '1' reason 'None' Jul 27 09:25:59 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Cannot update service status on host "user" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host user could not be found. 
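[Annotation, not part of the captured log: the "Libvirt host capabilities" dump that follows is the XML libvirt returns right after nova-compute connects to qemu:///system. A minimal sketch of how to reproduce the same queries by hand with the libvirt Python bindings is shown below; it assumes libvirt-python is installed and that the caller can reach qemu:///system, and it is illustrative only, not Nova's own code.]

    import libvirt  # assumes the libvirt-python bindings are installed

    # Read-only connection to the same hypervisor URI that appears in the log.
    conn = libvirt.openReadOnly('qemu:///system')

    # Host capabilities: the XML whose (tag-stripped) text content is dumped below.
    print(conn.getCapabilities())

    # The later "Getting domain capabilities" DEBUG lines correspond to per-arch
    # queries of this shape; the arguments here are only an illustrative example.
    print(conn.getDomainCapabilities('/usr/bin/qemu-system-x86_64', 'x86_64', 'q35', 'kvm'))

    conn.close()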
Jul 27 09:25:59 user nova-compute[70374]: DEBUG nova.virt.libvirt.volume.mount [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Initialising _HostMountState generation 0 {{(pid=70374) host_up /opt/stack/nova/nova/virt/libvirt/volume/mount.py:130}}
Jul 27 09:26:07 user nova-compute[70374]: INFO nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Libvirt host capabilities
[host capabilities XML elided: the markup was stripped in this capture, leaving only text nodes spread across the journal lines. Recoverable content: host UUID e20c3142-5af9-7467-ecd8-70b2e4a210d6; arch x86_64, CPU model IvyBridge-IBRS (vendor Intel); live-migration transports tcp and rdma; memory/page figures 8140616 KiB (2035154 x 4 KiB pages) and 8255100 KiB (2063775 x 4 KiB pages); secmodels apparmor (doi 0) and dac (doi 0, baselabel +64055:+108); and hvm guest support for the emulators /usr/bin/qemu-system-{alpha, arm, aarch64, cris, i386, m68k, microblaze, microblazeel, mips, mipsel, mips64, mips64el, ppc, ppc64, ppc64le, riscv32, riscv64, s390x, sh4, sh4eb, sparc, sparc64, x86_64, xtensa, xtensaeb}, each listed with its supported machine types (virt-2.6 through virt-6.2 plus assorted board models for arm/aarch64, pc-i440fx-*/pc-q35-* including the Ubuntu release aliases for i386/x86_64, pseries-*/powernv* for ppc64/ppc64le, s390-ccw-virtio-* for s390x, and so on).]
Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Getting domain capabilities for alpha via machine types: {None} {{(pid=70374) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Error from libvirt when retrieving domain capabilities for arch alpha / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-alpha' on this host {{(pid=70374) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None]
Getting domain capabilities for armv6l via machine types: {'virt', None} {{(pid=70374) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Error from libvirt when retrieving domain capabilities for arch armv6l / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-arm' on this host {{(pid=70374) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Error from libvirt when retrieving domain capabilities for arch armv6l / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-arm' on this host {{(pid=70374) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Getting domain capabilities for armv7l via machine types: {'virt'} {{(pid=70374) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Error from libvirt when retrieving domain capabilities for arch armv7l / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-arm' on this host {{(pid=70374) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Getting domain capabilities for aarch64 via machine types: {'virt'} {{(pid=70374) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Error from libvirt when retrieving domain capabilities for arch aarch64 / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-aarch64' on this host {{(pid=70374) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Getting domain capabilities for cris via machine types: {None} {{(pid=70374) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Error from libvirt when retrieving domain capabilities for arch cris / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-cris' on this host {{(pid=70374) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Getting domain capabilities for i686 via machine types: {'ubuntu-q35', 'pc', 'q35', 'ubuntu'} {{(pid=70374) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Libvirt 
host hypervisor capabilities for arch=i686 and machine_type=ubuntu-q35:
[flattened domainCapabilities XML elided; recoverable details: emulator /usr/bin/qemu-system-i386, domain type kvm, machine pc-q35-jammy, arch i686; firmware images /usr/share/OVMF/OVMF_CODE.fd, OVMF_CODE.secboot.fd, OVMF_CODE.ms.fd, /usr/share/AAVMF/AAVMF_CODE.fd and AAVMF32_CODE.fd; loader types rom and pflash; host-model CPU IvyBridge-IBRS (Intel vendor); custom CPU models from qemu64/qemu32 down to 486, covering the Westmere, Snowridge, Skylake, SandyBridge, Penryn, Opteron, Nehalem, IvyBridge, Icelake, Haswell, EPYC, Dhyana, Cooperlake, Conroe, Cascadelake and Broadwell families; memory backing file/anonymous/memfd; disk devices disk/cdrom/floppy/lun on buses fdc/scsi/virtio/usb/sata; virtio, virtio-transitional and virtio-non-transitional device model variants; graphics sdl/vnc/spice/egl-headless; hostdev mode subsystem with values default/mandatory/requisite/optional and subsystem types usb/pci/scsi; rng backends random/egd/builtin; filesystem drivers path/handle/virtiofs; TPM models tpm-tis/tpm-crb with backends passthrough/emulator] {{(pid=70374) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
[flattened domainCapabilities XML elided; same capability set as above, with machine pc-i440fx-6.2 and the ide bus additionally listed] {{(pid=70374) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
[flattened domainCapabilities XML elided; same capability set as above, with machine pc-q35-6.2] {{(pid=70374) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=ubuntu:
[flattened domainCapabilities XML elided; same capability set as above, with machine pc-i440fx-jammy and the ide bus additionally listed] {{(pid=70374) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Getting domain capabilities for m68k via machine types: {'virt', None} {{(pid=70374) _get_machine_types
/opt/stack/nova/nova/virt/libvirt/host.py:952}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Error from libvirt when retrieving domain capabilities for arch m68k / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-m68k' on this host {{(pid=70374) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Error from libvirt when retrieving domain capabilities for arch m68k / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-m68k' on this host {{(pid=70374) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Getting domain capabilities for microblaze via machine types: {None} {{(pid=70374) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Error from libvirt when retrieving domain capabilities for arch microblaze / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-microblaze' on this host {{(pid=70374) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Getting domain capabilities for microblazeel via machine types: {None} {{(pid=70374) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Error from libvirt when retrieving domain capabilities for arch microblazeel / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-microblazeel' on this host {{(pid=70374) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Getting domain capabilities for mips via machine types: {None} {{(pid=70374) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Error from libvirt when retrieving domain capabilities for arch mips / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-mips' on this host {{(pid=70374) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Getting domain capabilities for mipsel via machine types: {None} {{(pid=70374) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Error from libvirt when retrieving domain capabilities for arch mipsel / virt_type kvm / machine_type None: [Error Code 
8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-mipsel' on this host {{(pid=70374) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Getting domain capabilities for mips64 via machine types: {None} {{(pid=70374) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Error from libvirt when retrieving domain capabilities for arch mips64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-mips64' on this host {{(pid=70374) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Getting domain capabilities for mips64el via machine types: {None} {{(pid=70374) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Error from libvirt when retrieving domain capabilities for arch mips64el / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-mips64el' on this host {{(pid=70374) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Getting domain capabilities for ppc via machine types: {None} {{(pid=70374) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Error from libvirt when retrieving domain capabilities for arch ppc / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc' on this host {{(pid=70374) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Getting domain capabilities for ppc64 via machine types: {'pseries', 'powernv', None} {{(pid=70374) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Error from libvirt when retrieving domain capabilities for arch ppc64 / virt_type kvm / machine_type pseries: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64' on this host {{(pid=70374) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Error from libvirt when retrieving domain capabilities for arch ppc64 / virt_type kvm / machine_type powernv: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64' on this host {{(pid=70374) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None 
req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Error from libvirt when retrieving domain capabilities for arch ppc64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64' on this host {{(pid=70374) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Getting domain capabilities for ppc64le via machine types: {'pseries', 'powernv'} {{(pid=70374) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Error from libvirt when retrieving domain capabilities for arch ppc64le / virt_type kvm / machine_type pseries: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64le' on this host {{(pid=70374) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Error from libvirt when retrieving domain capabilities for arch ppc64le / virt_type kvm / machine_type powernv: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64le' on this host {{(pid=70374) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Getting domain capabilities for riscv32 via machine types: {None} {{(pid=70374) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Error from libvirt when retrieving domain capabilities for arch riscv32 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-riscv32' on this host {{(pid=70374) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Getting domain capabilities for riscv64 via machine types: {None} {{(pid=70374) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Error from libvirt when retrieving domain capabilities for arch riscv64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-riscv64' on this host {{(pid=70374) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Getting domain capabilities for s390x via machine types: {'s390-ccw-virtio'} {{(pid=70374) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Error from libvirt when retrieving domain capabilities for arch s390x / virt_type kvm / machine_type s390-ccw-virtio: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-s390x' on this host 
{{(pid=70374) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Getting domain capabilities for sh4 via machine types: {None} {{(pid=70374) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Error from libvirt when retrieving domain capabilities for arch sh4 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-sh4' on this host {{(pid=70374) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Getting domain capabilities for sh4eb via machine types: {None} {{(pid=70374) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Error from libvirt when retrieving domain capabilities for arch sh4eb / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-sh4eb' on this host {{(pid=70374) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Getting domain capabilities for sparc via machine types: {None} {{(pid=70374) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Error from libvirt when retrieving domain capabilities for arch sparc / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-sparc' on this host {{(pid=70374) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Getting domain capabilities for sparc64 via machine types: {None} {{(pid=70374) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Error from libvirt when retrieving domain capabilities for arch sparc64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-sparc64' on this host {{(pid=70374) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Getting domain capabilities for x86_64 via machine types: {'ubuntu-q35', 'pc', 'q35', 'ubuntu'} {{(pid=70374) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=ubuntu-q35:
Jul 27 09:26:07 user nova-compute[70374]: [domainCapabilities XML omitted: the XML markup was stripped when this log was captured; recoverable values: emulator /usr/bin/qemu-system-x86_64, domain kvm, machine pc-q35-jammy, arch x86_64, efi loaders /usr/share/OVMF/OVMF_CODE_4M.ms.fd, /usr/share/OVMF/OVMF_CODE_4M.secboot.fd and /usr/share/OVMF/OVMF_CODE_4M.fd (rom/pflash, secure boot yes/no), host CPU model IvyBridge-IBRS (vendor Intel), custom CPU models from qemu64/kvm64 through the Westmere, Skylake, Icelake, Cascadelake, EPYC and Broadwell families down to 486, memory backing file/anonymous/memfd, disk devices disk/cdrom/floppy/lun on buses fdc/scsi/virtio/usb/sata, virtio/virtio-transitional/virtio-non-transitional models, graphics sdl/vnc/spice/egl-headless, hostdev subsystem usb/pci/scsi, rng backends random/egd/builtin, filesystem drivers path/handle/virtiofs, TPM models tpm-tis/tpm-crb with passthrough/emulator backends] {{(pid=70374) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jul 27 09:26:07 user nova-compute[70374]: [domainCapabilities XML omitted as above; differences for this machine type: machine pc-i440fx-6.2, single efi loader /usr/share/OVMF/OVMF_CODE_4M.fd (secure boot: no), ide added to the disk bus list; CPU and device lists otherwise identical] {{(pid=70374) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jul 27 09:26:07 user nova-compute[70374]: [domainCapabilities XML omitted as above; differences for this machine type: machine pc-q35-6.2, same three OVMF loaders and secure boot yes/no as ubuntu-q35; CPU and device lists otherwise identical] {{(pid=70374) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=ubuntu:
Jul 27 09:26:07 user nova-compute[70374]: [domainCapabilities XML omitted as above; differences for this machine type: machine pc-i440fx-jammy, single efi loader /usr/share/OVMF/OVMF_CODE_4M.fd (secure boot: no), ide added to the disk bus list; CPU and device lists otherwise identical] {{(pid=70374) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Getting domain capabilities for xtensa via machine types:
{None} {{(pid=70374) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Error from libvirt when retrieving domain capabilities for arch xtensa / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-xtensa' on this host {{(pid=70374) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Getting domain capabilities for xtensaeb via machine types: {None} {{(pid=70374) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Error from libvirt when retrieving domain capabilities for arch xtensaeb / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-xtensaeb' on this host {{(pid=70374) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Checking secure boot support for host arch (x86_64) {{(pid=70374) supports_secure_boot /opt/stack/nova/nova/virt/libvirt/host.py:1779}}
Jul 27 09:26:07 user nova-compute[70374]: INFO nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Secure Boot support detected
Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] cpu compare xml: [guest CPU comparison XML omitted: markup stripped in capture; model Nehalem] {{(pid=70374) _compare_cpu /opt/stack/nova/nova/virt/libvirt/driver.py:10023}}
Jul 27 09:26:07 user nova-compute[70374]: INFO nova.virt.node [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Generated node identity f7548644-4a09-4ad8-9aa6-6e05d85a9f5b
Jul 27 09:26:07 user nova-compute[70374]: INFO nova.virt.node [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Wrote node identity f7548644-4a09-4ad8-9aa6-6e05d85a9f5b to /opt/stack/data/nova/compute_id
Jul 27 09:26:07 user nova-compute[70374]: WARNING nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Compute nodes ['f7548644-4a09-4ad8-9aa6-6e05d85a9f5b'] for host user were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Jul 27 09:26:07 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Jul 27 09:26:07 user nova-compute[70374]: WARNING nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] No compute node record found for host user. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host user could not be found.
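The per-architecture probing recorded above is nova asking libvirt for domain capabilities for every installed qemu-system-* emulator with virt_type kvm; every non-x86 emulator fails because only x86_64 KVM is available on this host. A minimal sketch of reproducing the same probe with the libvirt Python bindings, assuming libvirt-python and a local qemu:///system connection (the probe list is an illustrative subset taken from this log):

    import libvirt

    # Illustrative subset of the arch/emulator/machine combinations probed above;
    # only x86_64 is expected to support virt_type=kvm on this host.
    PROBES = [
        ('x86_64', '/usr/bin/qemu-system-x86_64', 'q35'),
        ('mipsel', '/usr/bin/qemu-system-mipsel', None),
        ('s390x', '/usr/bin/qemu-system-s390x', 's390-ccw-virtio'),
    ]

    conn = libvirt.open('qemu:///system')
    try:
        for arch, emulator, machine in PROBES:
            try:
                caps_xml = conn.getDomainCapabilities(emulator, arch, machine, 'kvm', 0)
                print(f'{arch}: OK ({len(caps_xml)} bytes of domainCapabilities XML)')
            except libvirt.libvirtError as exc:
                # Mirrors the "[Error Code 8]: invalid argument: KVM is not
                # supported by '...'" records in the log above.
                print(f'{arch}: error {exc.get_error_code()}: {exc.get_error_message()}')
    finally:
        conn.close()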
Jul 27 09:26:07 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Auditing locally available compute resources for user (node: user) {{(pid=70374) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} Jul 27 09:26:07 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Jul 27 09:26:07 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
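The paired 'Acquiring lock "compute_resources" by ...' and 'Lock "compute_resources" acquired by ... :: waited ...' lines above come from oslo.concurrency's lockutils wrapper around the resource tracker methods (the source reference in the records points at lockutils.py's inner function). A minimal sketch of the same pattern, assuming only the oslo.concurrency library and DEBUG logging; the function below is illustrative, not nova's actual code:

    from oslo_concurrency import lockutils

    # Callers of any function decorated with the same lock name are serialized
    # on an in-process semaphore; with DEBUG logging enabled, oslo emits the
    # "Acquiring lock ... by ..." / "Lock ... acquired by ... :: waited" lines
    # seen above from its inner wrapper in lockutils.py.
    @lockutils.synchronized('compute_resources')
    def audit_resources_sketch(nodename):
        # Placeholder body: the real resource tracker inspects the hypervisor
        # and updates placement while holding this lock.
        return nodename

    if __name__ == '__main__':
        audit_resources_sketch('user')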
Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Hypervisor/Node resource view: name=user free_ram=10932MB free_disk=26.33572769165039GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70374) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1080}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:26:07 user nova-compute[70374]: WARNING nova.compute.resource_tracker [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] No compute node record for user:f7548644-4a09-4ad8-9aa6-6e05d85a9f5b: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host f7548644-4a09-4ad8-9aa6-6e05d85a9f5b could not be found. Jul 27 09:26:07 user nova-compute[70374]: INFO nova.compute.resource_tracker [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Compute node record created for user:user with uuid: f7548644-4a09-4ad8-9aa6-6e05d85a9f5b Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=70374) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1103}} Jul 27 09:26:07 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Final resource view: name=user phys_ram=16011MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=70374) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1112}} Jul 27 09:26:08 user nova-compute[70374]: INFO nova.scheduler.client.report [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [req-44b37c6f-6944-449c-9293-bee6b180e7f2] Created resource provider record via placement API for resource provider with UUID f7548644-4a09-4ad8-9aa6-6e05d85a9f5b and name user. 
Jul 27 09:26:08 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] /sys/module/kvm_amd/parameters/sev does not exist {{(pid=70374) _kernel_supports_amd_sev /opt/stack/nova/nova/virt/libvirt/host.py:1795}} Jul 27 09:26:08 user nova-compute[70374]: INFO nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] kernel doesn't support AMD SEV Jul 27 09:26:08 user nova-compute[70374]: DEBUG nova.compute.provider_tree [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Updating inventory in ProviderTree for provider f7548644-4a09-4ad8-9aa6-6e05d85a9f5b with inventory: {'MEMORY_MB': {'total': 16011, 'min_unit': 1, 'max_unit': 16011, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 12, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 40, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 0}} {{(pid=70374) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Jul 27 09:26:08 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70374) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Jul 27 09:26:08 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Libvirt baseline CPU Jul 27 09:26:08 user nova-compute[70374]: x86_64 Jul 27 09:26:08 user nova-compute[70374]: Nehalem Jul 27 09:26:08 user nova-compute[70374]: Intel Jul 27 09:26:08 user nova-compute[70374]: Jul 27 09:26:08 user nova-compute[70374]: Jul 27 09:26:08 user nova-compute[70374]: {{(pid=70374) _get_guest_baseline_cpu_features /opt/stack/nova/nova/virt/libvirt/driver.py:12513}} Jul 27 09:26:08 user nova-compute[70374]: DEBUG nova.scheduler.client.report [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Updated inventory for provider f7548644-4a09-4ad8-9aa6-6e05d85a9f5b with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 16011, 'min_unit': 1, 'max_unit': 16011, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 12, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 40, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 0}} {{(pid=70374) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} Jul 27 09:26:08 user nova-compute[70374]: DEBUG nova.compute.provider_tree [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Updating resource provider f7548644-4a09-4ad8-9aa6-6e05d85a9f5b generation from 0 to 1 during operation: update_inventory {{(pid=70374) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} Jul 27 09:26:08 user nova-compute[70374]: DEBUG nova.compute.provider_tree [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Updating inventory in ProviderTree for provider f7548644-4a09-4ad8-9aa6-6e05d85a9f5b with inventory: {'MEMORY_MB': {'total': 16011, 'reserved': 512, 'min_unit': 1, 'max_unit': 16011, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} 
{{(pid=70374) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Jul 27 09:26:08 user nova-compute[70374]: DEBUG nova.compute.provider_tree [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Updating resource provider f7548644-4a09-4ad8-9aa6-6e05d85a9f5b generation from 1 to 2 during operation: update_traits {{(pid=70374) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} Jul 27 09:26:08 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Compute_service record updated for user:user {{(pid=70374) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1041}} Jul 27 09:26:08 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.672s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:26:08 user nova-compute[70374]: DEBUG nova.service [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Creating RPC server for service compute {{(pid=70374) start /opt/stack/nova/nova/service.py:182}} Jul 27 09:26:08 user nova-compute[70374]: DEBUG nova.service [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Join ServiceGroup membership for this service compute {{(pid=70374) start /opt/stack/nova/nova/service.py:199}} Jul 27 09:26:08 user nova-compute[70374]: DEBUG nova.servicegroup.drivers.db [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] DB_Driver: join new ServiceGroup member user to the compute group, service = {{(pid=70374) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} Jul 27 09:26:22 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._sync_power_states {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:26:22 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:26:59 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:26:59 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:26:59 user nova-compute[70374]: DEBUG nova.compute.manager [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Starting heal instance info cache {{(pid=70374) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9842}} Jul 27 09:26:59 user nova-compute[70374]: DEBUG nova.compute.manager [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Rebuilding the list of instances to heal {{(pid=70374) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9846}} Jul 27 09:26:59 user 
nova-compute[70374]: DEBUG nova.compute.manager [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Didn't find any instances for network info cache update. {{(pid=70374) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9928}} Jul 27 09:26:59 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:26:59 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:26:59 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:26:59 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:26:59 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:26:59 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:26:59 user nova-compute[70374]: DEBUG nova.compute.manager [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=70374) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10461}} Jul 27 09:26:59 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager.update_available_resource {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:26:59 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:26:59 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:26:59 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:26:59 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Auditing locally available compute resources for user (node: user) {{(pid=70374) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} Jul 27 09:26:59 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Jul 27 09:26:59 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
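The inventory pushed to Placement in the entries above carries total, reserved and allocation_ratio per resource class, and Placement treats the schedulable capacity as (total - reserved) * allocation_ratio. A quick check against the logged values, as a sketch:

    # Sketch only: recompute the capacity Placement derives from the inventory
    # logged above, using capacity = (total - reserved) * allocation_ratio.
    inventory = {
        'MEMORY_MB': {'total': 16011, 'reserved': 512, 'allocation_ratio': 1.0},
        'VCPU':      {'total': 12,    'reserved': 0,   'allocation_ratio': 4.0},
        'DISK_GB':   {'total': 40,    'reserved': 0,   'allocation_ratio': 1.0},
    }
    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(f"{rc}: {capacity:g}")
    # MEMORY_MB: 15499, VCPU: 48, DISK_GB: 40

So this 12-vCPU host with allocation_ratio 4.0 can hold up to 48 vCPUs of instances, while only 15499 MB of the 16011 MB of RAM is schedulable because 512 MB is reserved.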
Jul 27 09:26:59 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Hypervisor/Node resource view: name=user free_ram=10324MB free_disk=26.239673614501953GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70374) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1080}} Jul 27 09:26:59 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:26:59 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:26:59 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=70374) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1103}} Jul 27 09:26:59 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Final resource view: name=user phys_ram=16011MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=70374) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1112}} Jul 27 09:26:59 user nova-compute[70374]: DEBUG nova.compute.provider_tree [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Inventory has not changed in ProviderTree for provider: f7548644-4a09-4ad8-9aa6-6e05d85a9f5b {{(pid=70374) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Jul 27 09:26:59 user nova-compute[70374]: DEBUG nova.scheduler.client.report [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Inventory has not changed for provider f7548644-4a09-4ad8-9aa6-6e05d85a9f5b based on inventory data: {'MEMORY_MB': {'total': 16011, 'reserved': 512, 'min_unit': 1, 'max_unit': 16011, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70374) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Jul 27 09:26:59 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Compute_service record updated for user:user {{(pid=70374) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1041}} Jul 27 09:26:59 user nova-compute[70374]: DEBUG 
oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.134s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:27:59 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:27:59 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager.update_available_resource {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:27:59 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:27:59 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:27:59 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:27:59 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Auditing locally available compute resources for user (node: user) {{(pid=70374) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} Jul 27 09:28:00 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Jul 27 09:28:00 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
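The repeated Acquiring/acquired/released lines for the "compute_resources" lock come from oslo.concurrency, which logs at DEBUG how long each caller waited for the lock and how long it was held. A minimal sketch of that pattern (illustrative only, not Nova's actual resource-tracker code):

    # Minimal sketch: a function guarded by the same kind of in-process lock
    # that produces the 'Lock "compute_resources" acquired/"released"' lines above.
    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def _update_available_resource():
        # Body runs with the "compute_resources" lock held; oslo.concurrency
        # emits the waited/held timings seen in the journal entries.
        pass

    _update_available_resource()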
Jul 27 09:28:00 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Hypervisor/Node resource view: name=user free_ram=10282MB free_disk=26.284893035888672GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70374) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1080}} Jul 27 09:28:00 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:28:00 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:28:00 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=70374) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1103}} Jul 27 09:28:00 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Final resource view: name=user phys_ram=16011MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=70374) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1112}} Jul 27 09:28:00 user nova-compute[70374]: DEBUG nova.compute.provider_tree [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Inventory has not changed in ProviderTree for provider: f7548644-4a09-4ad8-9aa6-6e05d85a9f5b {{(pid=70374) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Jul 27 09:28:00 user nova-compute[70374]: DEBUG nova.scheduler.client.report [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Inventory has not changed for provider f7548644-4a09-4ad8-9aa6-6e05d85a9f5b based on inventory data: {'MEMORY_MB': {'total': 16011, 'reserved': 512, 'min_unit': 1, 'max_unit': 16011, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70374) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Jul 27 09:28:00 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Compute_service record updated for user:user {{(pid=70374) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1041}} Jul 27 09:28:00 user nova-compute[70374]: DEBUG 
oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.136s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:28:00 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:28:00 user nova-compute[70374]: DEBUG nova.compute.manager [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Starting heal instance info cache {{(pid=70374) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9842}} Jul 27 09:28:00 user nova-compute[70374]: DEBUG nova.compute.manager [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Rebuilding the list of instances to heal {{(pid=70374) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9846}} Jul 27 09:28:00 user nova-compute[70374]: DEBUG nova.compute.manager [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Didn't find any instances for network info cache update. {{(pid=70374) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9928}} Jul 27 09:28:00 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:28:00 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:28:00 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:28:00 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:28:00 user nova-compute[70374]: DEBUG nova.compute.manager [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=70374) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10461}} Jul 27 09:28:01 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:28:01 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:28:01 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:29:00 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:29:00 user nova-compute[70374]: DEBUG nova.compute.manager [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Starting heal instance info cache {{(pid=70374) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9842}} Jul 27 09:29:00 user nova-compute[70374]: DEBUG nova.compute.manager [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Rebuilding the list of instances to heal {{(pid=70374) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9846}} Jul 27 09:29:00 user nova-compute[70374]: DEBUG nova.compute.manager [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Didn't find any instances for network info cache update. 
{{(pid=70374) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9928}} Jul 27 09:29:00 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:29:01 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:29:01 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:29:01 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:29:01 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:29:01 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:29:01 user nova-compute[70374]: DEBUG nova.compute.manager [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=70374) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10461}} Jul 27 09:29:01 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager.update_available_resource {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:29:01 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:29:01 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:29:01 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:29:01 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Auditing locally available compute resources for user (node: user) {{(pid=70374) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} Jul 27 09:29:01 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Jul 27 09:29:01 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
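The "Running periodic task ComputeManager._..." lines are emitted by oslo.service as it iterates the methods registered as periodic tasks on the compute manager. A simplified sketch of how such tasks are declared, assuming a made-up Manager class rather than the real ComputeManager:

    # Illustrative sketch, not Nova's code: declare periodic tasks the way
    # oslo.service expects, which is what drives the periodic_task lines above.
    from oslo_config import cfg
    from oslo_service import periodic_task

    CONF = cfg.CONF

    class Manager(periodic_task.PeriodicTasks):
        def __init__(self):
            super().__init__(CONF)

        @periodic_task.periodic_task(spacing=60)
        def _heal_instance_info_cache(self, context):
            # The real task refreshes the network info cache for local instances.
            pass

        @periodic_task.periodic_task(spacing=60)
        def update_available_resource(self, context):
            # The real task re-audits host resources and reports them to Placement.
            pass

    mgr = Manager()
    # In the real service, oslo.service calls this on a timer loop.
    mgr.run_periodic_tasks(context=None)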
Jul 27 09:29:01 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Hypervisor/Node resource view: name=user free_ram=10264MB free_disk=26.058773040771484GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70374) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1080}} Jul 27 09:29:01 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:29:01 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:29:01 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=70374) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1103}} Jul 27 09:29:01 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Final resource view: name=user phys_ram=16011MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=70374) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1112}} Jul 27 09:29:01 user nova-compute[70374]: DEBUG nova.compute.provider_tree [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Inventory has not changed in ProviderTree for provider: f7548644-4a09-4ad8-9aa6-6e05d85a9f5b {{(pid=70374) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Jul 27 09:29:01 user nova-compute[70374]: DEBUG nova.scheduler.client.report [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Inventory has not changed for provider f7548644-4a09-4ad8-9aa6-6e05d85a9f5b based on inventory data: {'MEMORY_MB': {'total': 16011, 'reserved': 512, 'min_unit': 1, 'max_unit': 16011, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70374) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Jul 27 09:29:01 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Compute_service record updated for user:user {{(pid=70374) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1041}} Jul 27 09:29:01 user nova-compute[70374]: DEBUG 
oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.154s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:29:03 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:30:00 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:30:00 user nova-compute[70374]: DEBUG nova.compute.manager [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Starting heal instance info cache {{(pid=70374) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9842}} Jul 27 09:30:00 user nova-compute[70374]: DEBUG nova.compute.manager [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Rebuilding the list of instances to heal {{(pid=70374) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9846}} Jul 27 09:30:00 user nova-compute[70374]: DEBUG nova.compute.manager [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Didn't find any instances for network info cache update. {{(pid=70374) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9928}} Jul 27 09:30:01 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:30:01 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:30:01 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:30:01 user nova-compute[70374]: DEBUG nova.compute.manager [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=70374) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10461}} Jul 27 09:30:02 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager.update_available_resource {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:30:02 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:02 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:02 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:02 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Auditing locally available compute resources for user (node: user) {{(pid=70374) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} Jul 27 09:30:02 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Jul 27 09:30:02 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
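
The repeated "Acquiring lock" / "Lock ... acquired" / "Lock ... released" DEBUG triplets above are emitted by oslo.concurrency's lockutils wrapper around the resource tracker's critical sections. The following is a minimal illustrative sketch of that pattern, assuming a hypothetical do_resource_update()/do_cache_cleanup() body and reusing the "compute_resources" lock name from the log; it is not Nova's actual code.

    from oslo_concurrency import lockutils

    # Decorator form: the wrapped function runs while holding the named
    # in-process lock, and lockutils emits DEBUG lines with the
    # "waited"/"held" timings seen in the journal output above.
    @lockutils.synchronized('compute_resources')
    def do_resource_update():
        pass  # hypothetical critical section

    # Context-manager form of the same lock; lockutils logs the
    # acquire/release of the lock here as well.
    def do_cache_cleanup():
        with lockutils.lock('compute_resources'):
            pass  # hypothetical critical section

    if __name__ == '__main__':
        do_resource_update()
        do_cache_cleanup()

Both forms serialize callers on the same named lock within the process, which is why the periodic resource audit and the instance-claim paths in this log take turns on "compute_resources".
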
Jul 27 09:30:02 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Hypervisor/Node resource view: name=user free_ram=9513MB free_disk=26.08094024658203GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70374) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1080}} Jul 27 09:30:02 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:02 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:02 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=70374) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1103}} Jul 27 09:30:02 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Final resource view: name=user phys_ram=16011MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=70374) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1112}} Jul 27 09:30:02 user nova-compute[70374]: DEBUG nova.compute.provider_tree [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Inventory has not changed in ProviderTree for provider: f7548644-4a09-4ad8-9aa6-6e05d85a9f5b {{(pid=70374) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Jul 27 09:30:02 user nova-compute[70374]: DEBUG nova.scheduler.client.report [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Inventory has not changed for provider f7548644-4a09-4ad8-9aa6-6e05d85a9f5b based on inventory data: {'MEMORY_MB': {'total': 16011, 'reserved': 512, 'min_unit': 1, 'max_unit': 16011, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70374) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Jul 27 09:30:02 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Compute_service record updated for user:user {{(pid=70374) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1041}} Jul 27 09:30:02 user nova-compute[70374]: DEBUG 
oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.172s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:03 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:30:03 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:30:03 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:30:03 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:30:04 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:30:06 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Acquiring lock "04a990b9-ed32-41e9-b384-f0886e8d1b49" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:06 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Lock "04a990b9-ed32-41e9-b384-f0886e8d1b49" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:06 user nova-compute[70374]: DEBUG nova.compute.manager [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Starting instance... 
{{(pid=70374) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} Jul 27 09:30:06 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:06 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:06 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=70374) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Jul 27 09:30:06 user nova-compute[70374]: INFO nova.compute.claims [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Claim successful on node user Jul 27 09:30:07 user nova-compute[70374]: DEBUG nova.compute.provider_tree [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Inventory has not changed in ProviderTree for provider: f7548644-4a09-4ad8-9aa6-6e05d85a9f5b {{(pid=70374) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Jul 27 09:30:07 user nova-compute[70374]: DEBUG nova.scheduler.client.report [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Inventory has not changed for provider f7548644-4a09-4ad8-9aa6-6e05d85a9f5b based on inventory data: {'MEMORY_MB': {'total': 16011, 'reserved': 512, 'min_unit': 1, 'max_unit': 16011, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70374) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Jul 27 09:30:07 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.345s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:07 user nova-compute[70374]: DEBUG nova.compute.manager [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Start building networks asynchronously for instance. 
{{(pid=70374) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} Jul 27 09:30:07 user nova-compute[70374]: DEBUG nova.compute.manager [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Allocating IP information in the background. {{(pid=70374) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} Jul 27 09:30:07 user nova-compute[70374]: DEBUG nova.network.neutron [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] allocate_for_instance() {{(pid=70374) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} Jul 27 09:30:07 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Jul 27 09:30:07 user nova-compute[70374]: DEBUG nova.compute.manager [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Start building block device mappings for instance. {{(pid=70374) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} Jul 27 09:30:07 user nova-compute[70374]: DEBUG nova.compute.manager [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Start spawning the instance on the hypervisor. 
{{(pid=70374) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} Jul 27 09:30:07 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Creating instance directory {{(pid=70374) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Jul 27 09:30:07 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Creating image(s) Jul 27 09:30:07 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Acquiring lock "/opt/stack/data/nova/instances/04a990b9-ed32-41e9-b384-f0886e8d1b49/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:07 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Lock "/opt/stack/data/nova/instances/04a990b9-ed32-41e9-b384-f0886e8d1b49/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:07 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Lock "/opt/stack/data/nova/instances/04a990b9-ed32-41e9-b384-f0886e8d1b49/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:07 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Acquiring lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:07 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:07 user nova-compute[70374]: DEBUG nova.policy [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '544aa4a40ff941c2a52b24faa08a0d29', 
'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '65e935bb36794aa98eb3e94c30e647d9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70374) authorize /opt/stack/nova/nova/policy.py:203}} Jul 27 09:30:08 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1.part --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:08 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1.part --force-share --output=json" returned: 0 in 0.146s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:08 user nova-compute[70374]: DEBUG nova.virt.images [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] 35458adf-261a-4e0b-a4db-b243619b2394 was qcow2, converting to raw {{(pid=70374) fetch_to_raw /opt/stack/nova/nova/virt/images.py:165}} Jul 27 09:30:08 user nova-compute[70374]: DEBUG nova.privsep.utils [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=70374) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Jul 27 09:30:08 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1.part /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1.converted {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:08 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1.part /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1.converted" returned: 0 in 0.179s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:08 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 
tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1.converted --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:09 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1.converted --force-share --output=json" returned: 0 in 0.156s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:09 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 1.359s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:09 user nova-compute[70374]: INFO oslo.privsep.daemon [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova-cpu.conf', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpjix27fxf/privsep.sock'] Jul 27 09:30:09 user sudo[79508]: stack : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/nova-rootwrap /etc/nova/rootwrap.conf privsep-helper --config-file /etc/nova/nova-cpu.conf --privsep_context nova.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpjix27fxf/privsep.sock Jul 27 09:30:09 user sudo[79508]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1001) Jul 27 09:30:09 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Acquiring lock "fb5ccac9-1e45-4726-b681-cf34cf3fa521" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:09 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Lock "fb5ccac9-1e45-4726-b681-cf34cf3fa521" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:09 user nova-compute[70374]: DEBUG nova.compute.manager [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] 
[instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Starting instance... {{(pid=70374) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} Jul 27 09:30:09 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:09 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:09 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=70374) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Jul 27 09:30:09 user nova-compute[70374]: INFO nova.compute.claims [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Claim successful on node user Jul 27 09:30:10 user nova-compute[70374]: DEBUG nova.compute.provider_tree [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Inventory has not changed in ProviderTree for provider: f7548644-4a09-4ad8-9aa6-6e05d85a9f5b {{(pid=70374) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Jul 27 09:30:10 user nova-compute[70374]: DEBUG nova.scheduler.client.report [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Inventory has not changed for provider f7548644-4a09-4ad8-9aa6-6e05d85a9f5b based on inventory data: {'MEMORY_MB': {'total': 16011, 'reserved': 512, 'min_unit': 1, 'max_unit': 16011, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70374) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Jul 27 09:30:10 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.583s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:10 user nova-compute[70374]: DEBUG nova.compute.manager [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 
tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Start building networks asynchronously for instance. {{(pid=70374) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} Jul 27 09:30:10 user nova-compute[70374]: DEBUG nova.compute.manager [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Allocating IP information in the background. {{(pid=70374) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} Jul 27 09:30:10 user nova-compute[70374]: DEBUG nova.network.neutron [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] allocate_for_instance() {{(pid=70374) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} Jul 27 09:30:10 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Jul 27 09:30:10 user sudo[79508]: pam_unix(sudo:session): session closed for user root Jul 27 09:30:10 user nova-compute[70374]: INFO oslo.privsep.daemon [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Spawned new privsep daemon via rootwrap Jul 27 09:30:10 user nova-compute[70374]: INFO oslo.privsep.daemon [-] privsep daemon starting Jul 27 09:30:10 user nova-compute[70374]: INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0 Jul 27 09:30:10 user nova-compute[70374]: INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none Jul 27 09:30:10 user nova-compute[70374]: INFO oslo.privsep.daemon [-] privsep daemon running as pid 79521 Jul 27 09:30:10 user nova-compute[70374]: DEBUG nova.compute.manager [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Start building block device mappings for instance. 
{{(pid=70374) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} Jul 27 09:30:10 user nova-compute[70374]: DEBUG nova.network.neutron [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Successfully created port: 6858719a-7fa4-4a64-ad8a-02c9274adb55 {{(pid=70374) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} Jul 27 09:30:10 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:10 user nova-compute[70374]: DEBUG nova.compute.manager [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Start spawning the instance on the hypervisor. {{(pid=70374) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} Jul 27 09:30:10 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Creating instance directory {{(pid=70374) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Jul 27 09:30:10 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Creating image(s) Jul 27 09:30:10 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Acquiring lock "/opt/stack/data/nova/instances/fb5ccac9-1e45-4726-b681-cf34cf3fa521/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:10 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Lock "/opt/stack/data/nova/instances/fb5ccac9-1e45-4726-b681-cf34cf3fa521/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.003s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:10 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Lock "/opt/stack/data/nova/instances/fb5ccac9-1e45-4726-b681-cf34cf3fa521/disk.info" "released" by 
"nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:10 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:11 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json" returned: 0 in 0.159s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:11 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Acquiring lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:11 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:11 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:11 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json" returned: 0 in 0.144s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:11 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb 
tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Acquiring lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:11 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json" returned: 0 in 0.150s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:11 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1,backing_fmt=raw /opt/stack/data/nova/instances/04a990b9-ed32-41e9-b384-f0886e8d1b49/disk 1073741824 {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:11 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1,backing_fmt=raw /opt/stack/data/nova/instances/04a990b9-ed32-41e9-b384-f0886e8d1b49/disk 1073741824" returned: 0 in 0.086s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:11 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.241s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:11 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:11 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.216s {{(pid=70374) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:11 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:11 user nova-compute[70374]: DEBUG nova.policy [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd059c8dff3644f3c9c0c54498a4d78f7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd8acda82ef3f428fbb93847922a213d1', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70374) authorize /opt/stack/nova/nova/policy.py:203}} Jul 27 09:30:11 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json" returned: 0 in 0.147s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:11 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1,backing_fmt=raw /opt/stack/data/nova/instances/fb5ccac9-1e45-4726-b681-cf34cf3fa521/disk 1073741824 {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:11 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json" returned: 0 in 0.221s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:11 user nova-compute[70374]: DEBUG nova.virt.disk.api [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Checking if we can resize image /opt/stack/data/nova/instances/04a990b9-ed32-41e9-b384-f0886e8d1b49/disk. 
size=1073741824 {{(pid=70374) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Jul 27 09:30:11 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/04a990b9-ed32-41e9-b384-f0886e8d1b49/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:11 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1,backing_fmt=raw /opt/stack/data/nova/instances/fb5ccac9-1e45-4726-b681-cf34cf3fa521/disk 1073741824" returned: 0 in 0.078s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:11 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.229s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:11 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:11 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json" returned: 0 in 0.129s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:11 user nova-compute[70374]: DEBUG nova.virt.disk.api [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Checking if we can resize image /opt/stack/data/nova/instances/fb5ccac9-1e45-4726-b681-cf34cf3fa521/disk. 
size=1073741824 {{(pid=70374) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Jul 27 09:30:11 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fb5ccac9-1e45-4726-b681-cf34cf3fa521/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:11 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/04a990b9-ed32-41e9-b384-f0886e8d1b49/disk --force-share --output=json" returned: 0 in 0.198s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:11 user nova-compute[70374]: DEBUG nova.virt.disk.api [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Cannot resize image /opt/stack/data/nova/instances/04a990b9-ed32-41e9-b384-f0886e8d1b49/disk to a smaller size. {{(pid=70374) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Jul 27 09:30:11 user nova-compute[70374]: DEBUG nova.objects.instance [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Lazy-loading 'migration_context' on Instance uuid 04a990b9-ed32-41e9-b384-f0886e8d1b49 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:30:11 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Created local disks {{(pid=70374) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4851}} Jul 27 09:30:11 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Ensure instance console log exists: /opt/stack/data/nova/instances/04a990b9-ed32-41e9-b384-f0886e8d1b49/console.log {{(pid=70374) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4603}} Jul 27 09:30:11 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:11 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Lock "vgpu_resources" acquired by 
"nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:11 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:11 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fb5ccac9-1e45-4726-b681-cf34cf3fa521/disk --force-share --output=json" returned: 0 in 0.152s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:11 user nova-compute[70374]: DEBUG nova.virt.disk.api [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Cannot resize image /opt/stack/data/nova/instances/fb5ccac9-1e45-4726-b681-cf34cf3fa521/disk to a smaller size. {{(pid=70374) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Jul 27 09:30:11 user nova-compute[70374]: DEBUG nova.objects.instance [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Lazy-loading 'migration_context' on Instance uuid fb5ccac9-1e45-4726-b681-cf34cf3fa521 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:30:11 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Created local disks {{(pid=70374) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4851}} Jul 27 09:30:11 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Ensure instance console log exists: /opt/stack/data/nova/instances/fb5ccac9-1e45-4726-b681-cf34cf3fa521/console.log {{(pid=70374) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4603}} Jul 27 09:30:11 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:11 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Lock "vgpu_resources" acquired by 
"nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:11 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:12 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Acquiring lock "6a5593cd-3ab6-4859-b0fb-33a0ed702dc8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:12 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Lock "6a5593cd-3ab6-4859-b0fb-33a0ed702dc8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:12 user nova-compute[70374]: DEBUG nova.compute.manager [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Starting instance... {{(pid=70374) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} Jul 27 09:30:12 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Acquiring lock "6ae93f34-ce7e-4ae4-a5ba-36508361bd54" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:12 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Lock "6ae93f34-ce7e-4ae4-a5ba-36508361bd54" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:12 user nova-compute[70374]: DEBUG nova.compute.manager [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Starting instance... 
{{(pid=70374) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} Jul 27 09:30:12 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:12 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:12 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=70374) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Jul 27 09:30:12 user nova-compute[70374]: INFO nova.compute.claims [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Claim successful on node user Jul 27 09:30:12 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:12 user nova-compute[70374]: DEBUG nova.compute.provider_tree [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Inventory has not changed in ProviderTree for provider: f7548644-4a09-4ad8-9aa6-6e05d85a9f5b {{(pid=70374) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Jul 27 09:30:12 user nova-compute[70374]: DEBUG nova.scheduler.client.report [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Inventory has not changed for provider f7548644-4a09-4ad8-9aa6-6e05d85a9f5b based on inventory data: {'MEMORY_MB': {'total': 16011, 'reserved': 512, 'min_unit': 1, 'max_unit': 16011, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70374) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Jul 27 09:30:12 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.380s {{(pid=70374) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:12 user nova-compute[70374]: DEBUG nova.compute.manager [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Start building networks asynchronously for instance. {{(pid=70374) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} Jul 27 09:30:12 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.311s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:12 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=70374) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Jul 27 09:30:12 user nova-compute[70374]: INFO nova.compute.claims [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Claim successful on node user Jul 27 09:30:12 user nova-compute[70374]: DEBUG nova.compute.manager [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Allocating IP information in the background. {{(pid=70374) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} Jul 27 09:30:12 user nova-compute[70374]: DEBUG nova.network.neutron [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] allocate_for_instance() {{(pid=70374) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} Jul 27 09:30:12 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Jul 27 09:30:12 user nova-compute[70374]: DEBUG nova.compute.manager [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Start building block device mappings for instance. 
{{(pid=70374) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} Jul 27 09:30:12 user nova-compute[70374]: DEBUG nova.compute.provider_tree [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Inventory has not changed in ProviderTree for provider: f7548644-4a09-4ad8-9aa6-6e05d85a9f5b {{(pid=70374) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Jul 27 09:30:12 user nova-compute[70374]: DEBUG nova.compute.manager [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Start spawning the instance on the hypervisor. {{(pid=70374) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} Jul 27 09:30:12 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Creating instance directory {{(pid=70374) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Jul 27 09:30:12 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Creating image(s) Jul 27 09:30:12 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Acquiring lock "/opt/stack/data/nova/instances/6a5593cd-3ab6-4859-b0fb-33a0ed702dc8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:12 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Lock "/opt/stack/data/nova/instances/6a5593cd-3ab6-4859-b0fb-33a0ed702dc8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:12 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Lock "/opt/stack/data/nova/instances/6a5593cd-3ab6-4859-b0fb-33a0ed702dc8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:12 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json {{(pid=70374) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG nova.scheduler.client.report [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Inventory has not changed for provider f7548644-4a09-4ad8-9aa6-6e05d85a9f5b based on inventory data: {'MEMORY_MB': {'total': 16011, 'reserved': 512, 'min_unit': 1, 'max_unit': 16011, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70374) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.368s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG nova.compute.manager [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Start building networks asynchronously for instance. {{(pid=70374) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json" returned: 0 in 0.140s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Acquiring lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG nova.compute.manager [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Allocating IP information in the background. {{(pid=70374) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG nova.network.neutron [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] allocate_for_instance() {{(pid=70374) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG nova.policy [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c87590f95d1147f48a94203d8be751ab', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0d99bef1aeee4c6090e60bdbb0ecbeda', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70374) authorize /opt/stack/nova/nova/policy.py:203}} Jul 27 09:30:13 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Jul 27 09:30:13 user nova-compute[70374]: DEBUG nova.compute.manager [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Start building block device mappings for instance. 
{{(pid=70374) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json" returned: 0 in 0.158s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1,backing_fmt=raw /opt/stack/data/nova/instances/6a5593cd-3ab6-4859-b0fb-33a0ed702dc8/disk 1073741824 {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1,backing_fmt=raw /opt/stack/data/nova/instances/6a5593cd-3ab6-4859-b0fb-33a0ed702dc8/disk 1073741824" returned: 0 in 0.056s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.218s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG nova.compute.manager [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Start spawning the instance on the hypervisor. 
{{(pid=70374) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Creating instance directory {{(pid=70374) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Jul 27 09:30:13 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Creating image(s) Jul 27 09:30:13 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Acquiring lock "/opt/stack/data/nova/instances/6ae93f34-ce7e-4ae4-a5ba-36508361bd54/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Lock "/opt/stack/data/nova/instances/6ae93f34-ce7e-4ae4-a5ba-36508361bd54/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Lock "/opt/stack/data/nova/instances/6ae93f34-ce7e-4ae4-a5ba-36508361bd54/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.004s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json" returned: 0 in 0.149s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG nova.virt.disk.api [None 
req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Checking if we can resize image /opt/stack/data/nova/instances/6a5593cd-3ab6-4859-b0fb-33a0ed702dc8/disk. size=1073741824 {{(pid=70374) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a5593cd-3ab6-4859-b0fb-33a0ed702dc8/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json" returned: 0 in 0.135s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Acquiring lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG nova.network.neutron [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Successfully updated port: 6858719a-7fa4-4a64-ad8a-02c9274adb55 {{(pid=70374) _update_port /opt/stack/nova/nova/network/neutron.py:586}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG nova.policy [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 
tempest-ServerRescueNegativeTestJSON-241559119-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd059c8dff3644f3c9c0c54498a4d78f7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd8acda82ef3f428fbb93847922a213d1', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70374) authorize /opt/stack/nova/nova/policy.py:203}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Acquiring lock "refresh_cache-04a990b9-ed32-41e9-b384-f0886e8d1b49" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Acquired lock "refresh_cache-04a990b9-ed32-41e9-b384-f0886e8d1b49" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG nova.network.neutron [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Building network info cache for instance {{(pid=70374) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a5593cd-3ab6-4859-b0fb-33a0ed702dc8/disk --force-share --output=json" returned: 0 in 0.163s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG nova.virt.disk.api [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Cannot resize image /opt/stack/data/nova/instances/6a5593cd-3ab6-4859-b0fb-33a0ed702dc8/disk to a smaller size. 
{{(pid=70374) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG nova.objects.instance [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Lazy-loading 'migration_context' on Instance uuid 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Created local disks {{(pid=70374) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4851}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Ensure instance console log exists: /opt/stack/data/nova/instances/6a5593cd-3ab6-4859-b0fb-33a0ed702dc8/console.log {{(pid=70374) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4603}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json" returned: 0 in 0.136s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1,backing_fmt=raw 
/opt/stack/data/nova/instances/6ae93f34-ce7e-4ae4-a5ba-36508361bd54/disk 1073741824 {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1,backing_fmt=raw /opt/stack/data/nova/instances/6ae93f34-ce7e-4ae4-a5ba-36508361bd54/disk 1073741824" returned: 0 in 0.052s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.197s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG nova.network.neutron [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Instance cache missing network info. {{(pid=70374) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json" returned: 0 in 0.158s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG nova.virt.disk.api [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Checking if we can resize image /opt/stack/data/nova/instances/6ae93f34-ce7e-4ae4-a5ba-36508361bd54/disk. 
size=1073741824 {{(pid=70374) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Jul 27 09:30:13 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6ae93f34-ce7e-4ae4-a5ba-36508361bd54/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:14 user nova-compute[70374]: DEBUG nova.compute.manager [req-830af63e-4980-4c98-9db6-28b5819a706e req-0aacd555-0d49-4fb5-8d52-d132fad9c415 service nova] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Received event network-changed-6858719a-7fa4-4a64-ad8a-02c9274adb55 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:30:14 user nova-compute[70374]: DEBUG nova.compute.manager [req-830af63e-4980-4c98-9db6-28b5819a706e req-0aacd555-0d49-4fb5-8d52-d132fad9c415 service nova] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Refreshing instance network info cache due to event network-changed-6858719a-7fa4-4a64-ad8a-02c9274adb55. {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Jul 27 09:30:14 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-830af63e-4980-4c98-9db6-28b5819a706e req-0aacd555-0d49-4fb5-8d52-d132fad9c415 service nova] Acquiring lock "refresh_cache-04a990b9-ed32-41e9-b384-f0886e8d1b49" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Jul 27 09:30:14 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6ae93f34-ce7e-4ae4-a5ba-36508361bd54/disk --force-share --output=json" returned: 0 in 0.148s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:14 user nova-compute[70374]: DEBUG nova.virt.disk.api [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Cannot resize image /opt/stack/data/nova/instances/6ae93f34-ce7e-4ae4-a5ba-36508361bd54/disk to a smaller size. 
{{(pid=70374) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Jul 27 09:30:14 user nova-compute[70374]: DEBUG nova.objects.instance [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Lazy-loading 'migration_context' on Instance uuid 6ae93f34-ce7e-4ae4-a5ba-36508361bd54 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:30:14 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Created local disks {{(pid=70374) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4851}} Jul 27 09:30:14 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Ensure instance console log exists: /opt/stack/data/nova/instances/6ae93f34-ce7e-4ae4-a5ba-36508361bd54/console.log {{(pid=70374) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4603}} Jul 27 09:30:14 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:14 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:14 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:14 user nova-compute[70374]: DEBUG nova.network.neutron [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Successfully created port: fd47b104-c8ff-4b76-8f3a-53725f9f318c {{(pid=70374) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG nova.network.neutron [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Updating instance_info_cache with network_info: [{"id": "6858719a-7fa4-4a64-ad8a-02c9274adb55", "address": "fa:16:3e:e6:e0:42", "network": {"id": "e1a0003b-0f63-4efa-9d24-933f88eb0373", "bridge": "br-int", "label": 
"tempest-ServersNegativeTestJSON-651468270-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "65e935bb36794aa98eb3e94c30e647d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6858719a-7f", "ovs_interfaceid": "6858719a-7fa4-4a64-ad8a-02c9274adb55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Releasing lock "refresh_cache-04a990b9-ed32-41e9-b384-f0886e8d1b49" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG nova.compute.manager [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Instance network_info: |[{"id": "6858719a-7fa4-4a64-ad8a-02c9274adb55", "address": "fa:16:3e:e6:e0:42", "network": {"id": "e1a0003b-0f63-4efa-9d24-933f88eb0373", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-651468270-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "65e935bb36794aa98eb3e94c30e647d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6858719a-7f", "ovs_interfaceid": "6858719a-7fa4-4a64-ad8a-02c9274adb55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70374) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-830af63e-4980-4c98-9db6-28b5819a706e req-0aacd555-0d49-4fb5-8d52-d132fad9c415 service nova] Acquired lock "refresh_cache-04a990b9-ed32-41e9-b384-f0886e8d1b49" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG nova.network.neutron [req-830af63e-4980-4c98-9db6-28b5819a706e req-0aacd555-0d49-4fb5-8d52-d132fad9c415 service nova] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Refreshing network info cache for port 6858719a-7fa4-4a64-ad8a-02c9274adb55 {{(pid=70374) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 
tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Start _get_guest_xml network_info=[{"id": "6858719a-7fa4-4a64-ad8a-02c9274adb55", "address": "fa:16:3e:e6:e0:42", "network": {"id": "e1a0003b-0f63-4efa-9d24-933f88eb0373", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-651468270-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "65e935bb36794aa98eb3e94c30e647d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6858719a-7f", "ovs_interfaceid": "6858719a-7fa4-4a64-ad8a-02c9274adb55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-07-27T09:26:27Z,direct_url=,disk_format='qcow2',id=35458adf-261a-4e0b-a4db-b243619b2394,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='37aff788807d4b2aafc10d355583f7c7',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-07-27T09:26:29Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encrypted': False, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'size': 0, 'encryption_options': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'image_id': '35458adf-261a-4e0b-a4db-b243619b2394'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70374) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7553}} Jul 27 09:30:15 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Jul 27 09:30:15 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Jul 27 09:30:15 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Acquiring lock "8640c525-e6ba-4bf8-9fe0-2c08155dd1cb" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Lock "8640c525-e6ba-4bf8-9fe0-2c08155dd1cb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Searching host: 'user' for CPU controller through CGroups V1... {{(pid=70374) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1650}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] CPU controller missing on host. {{(pid=70374) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1660}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Searching host: 'user' for CPU controller through CGroups V2... {{(pid=70374) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1669}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] CPU controller found on host. 
{{(pid=70374) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1676}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70374) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-07-27T09:26:16Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1',id=6,is_public=True,memory_mb=512,name='m1.tiny',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-07-27T09:26:27Z,direct_url=,disk_format='qcow2',id=35458adf-261a-4e0b-a4db-b243619b2394,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='37aff788807d4b2aafc10d355583f7c7',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-07-27T09:26:29Z,virtual_size=,visibility=), allow threads: True {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Flavor limits 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Image limits 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Flavor pref 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Image pref 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum 
VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70374) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Got 1 possible topologies {{(pid=70374) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG nova.privsep.utils [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=70374) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG nova.virt.libvirt.vif [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-07-27T09:30:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-503176185',display_name='tempest-ServersNegativeTestJSON-server-503176185',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(6),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-503176185',id=1,image_ref='35458adf-261a-4e0b-a4db-b243619b2394',info_cache=InstanceInfoCache,instance_type_id=6,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='65e935bb36794aa98eb3e94c30e647d9',ramdisk_id='',reservation_id='r-xsrj9cxm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='35458adf-261a-4e0b-a4db-b243619b2394',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1865798359',owner_user_name='tempest-ServersNegativeTestJSON-1865798359-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-07-27T09:30:08Z,user_data=None,user_id='544aa4a40ff941c2a52b24faa08a0d29',uuid=04a990b9-ed32-41e9-b384-f0886e8d1b49,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6858719a-7fa4-4a64-ad8a-02c9274adb55", "address": "fa:16:3e:e6:e0:42", "network": {"id": "e1a0003b-0f63-4efa-9d24-933f88eb0373", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-651468270-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "65e935bb36794aa98eb3e94c30e647d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6858719a-7f", "ovs_interfaceid": "6858719a-7fa4-4a64-ad8a-02c9274adb55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70374) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Converting VIF {"id": "6858719a-7fa4-4a64-ad8a-02c9274adb55", "address": "fa:16:3e:e6:e0:42", "network": {"id": 
"e1a0003b-0f63-4efa-9d24-933f88eb0373", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-651468270-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "65e935bb36794aa98eb3e94c30e647d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6858719a-7f", "ovs_interfaceid": "6858719a-7fa4-4a64-ad8a-02c9274adb55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e6:e0:42,bridge_name='br-int',has_traffic_filtering=True,id=6858719a-7fa4-4a64-ad8a-02c9274adb55,network=Network(e1a0003b-0f63-4efa-9d24-933f88eb0373),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6858719a-7f') {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG nova.objects.instance [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Lazy-loading 'pci_devices' on Instance uuid 04a990b9-ed32-41e9-b384-f0886e8d1b49 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG nova.compute.manager [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Starting instance... 
{{(pid=70374) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] End _get_guest_xml xml= [libvirt guest XML not reproduced here: the XML markup was stripped when this log was captured, leaving only bare element text across the journald continuation lines; the recoverable values are instance uuid 04a990b9-ed32-41e9-b384-f0886e8d1b49, domain name instance-00000001, memory 524288, 1 vCPU, nova metadata (server name tempest-ServersNegativeTestJSON-server-503176185, creation time 2023-07-27 09:30:15, flavor fields 512, 1, 0, 0, 1, owner tempest-ServersNegativeTestJSON-1865798359-project-member in project tempest-ServersNegativeTestJSON-1865798359), sysinfo strings OpenStack Foundation / OpenStack Nova / 0.0.0 / Virtual Machine, os type hvm, CPU model Nehalem, and an RNG device backed by /dev/urandom] {{(pid=70374) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7559}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG nova.virt.libvirt.vif [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-07-27T09:30:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-503176185',display_name='tempest-ServersNegativeTestJSON-server-503176185',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(6),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-503176185',id=1,image_ref='35458adf-261a-4e0b-a4db-b243619b2394',info_cache=InstanceInfoCache,instance_type_id=6,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='65e935bb36794aa98eb3e94c30e647d9',ramdisk_id='',reservation_id='r-xsrj9cxm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='35458adf-261a-4e0b-a4db-b243619b2394',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1865798359',owner_user_name='tempest-ServersNegativeTestJSON-1865798359-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-07-27T09:30:08Z,user_data=None,user_id='544aa4a40ff941c2a52b24faa08a0d29',uuid=04a990b9-ed32-41e9-b384-f0886e8d1b49,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6858719a-7fa4-4a64-ad8a-02c9274adb55", "address": "fa:16:3e:e6:e0:42", "network": {"id": "e1a0003b-0f63-4efa-9d24-933f88eb0373", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-651468270-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4,
"meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "65e935bb36794aa98eb3e94c30e647d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6858719a-7f", "ovs_interfaceid": "6858719a-7fa4-4a64-ad8a-02c9274adb55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Converting VIF {"id": "6858719a-7fa4-4a64-ad8a-02c9274adb55", "address": "fa:16:3e:e6:e0:42", "network": {"id": "e1a0003b-0f63-4efa-9d24-933f88eb0373", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-651468270-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "65e935bb36794aa98eb3e94c30e647d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6858719a-7f", "ovs_interfaceid": "6858719a-7fa4-4a64-ad8a-02c9274adb55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e6:e0:42,bridge_name='br-int',has_traffic_filtering=True,id=6858719a-7fa4-4a64-ad8a-02c9274adb55,network=Network(e1a0003b-0f63-4efa-9d24-933f88eb0373),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6858719a-7f') {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG os_vif [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e6:e0:42,bridge_name='br-int',has_traffic_filtering=True,id=6858719a-7fa4-4a64-ad8a-02c9274adb55,network=Network(e1a0003b-0f63-4efa-9d24-933f88eb0373),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6858719a-7f') {{(pid=70374) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Created schema index Interface.name {{(pid=70374) 
autocreate_indices /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/__init__.py:106}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Created schema index Port.name {{(pid=70374) autocreate_indices /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/__init__.py:106}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Created schema index Bridge.name {{(pid=70374) autocreate_indices /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/__init__.py:106}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] tcp:127.0.0.1:6640: entering CONNECTING {{(pid=70374) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [POLLOUT] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=70374) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 
tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=70374) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Jul 27 09:30:15 user nova-compute[70374]: INFO nova.compute.claims [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Claim successful on node user Jul 27 09:30:15 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Jul 27 09:30:15 user nova-compute[70374]: INFO oslo.privsep.daemon [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova-cpu.conf', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp9y63cp12/privsep.sock'] Jul 27 09:30:15 user sudo[79592]: stack : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/nova-rootwrap /etc/nova/rootwrap.conf privsep-helper --config-file /etc/nova/nova-cpu.conf --privsep_context vif_plug_ovs.privsep.vif_plug --privsep_sock_path /tmp/tmp9y63cp12/privsep.sock Jul 27 09:30:15 user sudo[79592]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1001) Jul 27 09:30:15 user nova-compute[70374]: DEBUG nova.network.neutron [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Successfully created port: 98a9819c-4270-4b75-b851-635a3b19a7b4 {{(pid=70374) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG nova.compute.provider_tree [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Inventory has not changed in ProviderTree for provider: f7548644-4a09-4ad8-9aa6-6e05d85a9f5b {{(pid=70374) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG nova.scheduler.client.report [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Inventory has not changed for provider 
f7548644-4a09-4ad8-9aa6-6e05d85a9f5b based on inventory data: {'MEMORY_MB': {'total': 16011, 'reserved': 512, 'min_unit': 1, 'max_unit': 16011, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70374) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.385s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG nova.compute.manager [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Start building networks asynchronously for instance. {{(pid=70374) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG nova.compute.manager [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Allocating IP information in the background. {{(pid=70374) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} Jul 27 09:30:15 user nova-compute[70374]: DEBUG nova.network.neutron [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] allocate_for_instance() {{(pid=70374) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} Jul 27 09:30:15 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Jul 27 09:30:15 user nova-compute[70374]: DEBUG nova.compute.manager [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Start building block device mappings for instance. {{(pid=70374) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} Jul 27 09:30:16 user nova-compute[70374]: DEBUG nova.compute.manager [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Start spawning the instance on the hypervisor. 
{{(pid=70374) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} Jul 27 09:30:16 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Creating instance directory {{(pid=70374) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Jul 27 09:30:16 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Creating image(s) Jul 27 09:30:16 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Acquiring lock "/opt/stack/data/nova/instances/8640c525-e6ba-4bf8-9fe0-2c08155dd1cb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:16 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Lock "/opt/stack/data/nova/instances/8640c525-e6ba-4bf8-9fe0-2c08155dd1cb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:16 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Lock "/opt/stack/data/nova/instances/8640c525-e6ba-4bf8-9fe0-2c08155dd1cb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:16 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:16 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json" returned: 0 in 0.137s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:16 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None 
req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Acquiring lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:16 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:16 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:16 user nova-compute[70374]: DEBUG nova.policy [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4c1d49c23b0045d881d9f47e33447162', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bfaa69b3745a435795aa636ccddec3af', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70374) authorize /opt/stack/nova/nova/policy.py:203}} Jul 27 09:30:16 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json" returned: 0 in 0.158s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:16 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1,backing_fmt=raw /opt/stack/data/nova/instances/8640c525-e6ba-4bf8-9fe0-2c08155dd1cb/disk 1073741824 {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:16 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 
tempest-AttachVolumeNegativeTest-305502590-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1,backing_fmt=raw /opt/stack/data/nova/instances/8640c525-e6ba-4bf8-9fe0-2c08155dd1cb/disk 1073741824" returned: 0 in 0.062s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:16 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.225s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:16 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:16 user nova-compute[70374]: DEBUG nova.network.neutron [req-830af63e-4980-4c98-9db6-28b5819a706e req-0aacd555-0d49-4fb5-8d52-d132fad9c415 service nova] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Updated VIF entry in instance network info cache for port 6858719a-7fa4-4a64-ad8a-02c9274adb55. {{(pid=70374) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Jul 27 09:30:16 user nova-compute[70374]: DEBUG nova.network.neutron [req-830af63e-4980-4c98-9db6-28b5819a706e req-0aacd555-0d49-4fb5-8d52-d132fad9c415 service nova] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Updating instance_info_cache with network_info: [{"id": "6858719a-7fa4-4a64-ad8a-02c9274adb55", "address": "fa:16:3e:e6:e0:42", "network": {"id": "e1a0003b-0f63-4efa-9d24-933f88eb0373", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-651468270-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "65e935bb36794aa98eb3e94c30e647d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6858719a-7f", "ovs_interfaceid": "6858719a-7fa4-4a64-ad8a-02c9274adb55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:30:16 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-830af63e-4980-4c98-9db6-28b5819a706e req-0aacd555-0d49-4fb5-8d52-d132fad9c415 service nova] Releasing lock "refresh_cache-04a990b9-ed32-41e9-b384-f0886e8d1b49" {{(pid=70374) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Jul 27 09:30:16 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Acquiring lock "25214e8a-c626-46a7-b273-eb491c2fc91b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:16 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Lock "25214e8a-c626-46a7-b273-eb491c2fc91b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:16 user nova-compute[70374]: DEBUG nova.compute.manager [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Starting instance... {{(pid=70374) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} Jul 27 09:30:16 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json" returned: 0 in 0.151s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:16 user nova-compute[70374]: DEBUG nova.virt.disk.api [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Checking if we can resize image /opt/stack/data/nova/instances/8640c525-e6ba-4bf8-9fe0-2c08155dd1cb/disk. 
size=1073741824 {{(pid=70374) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Jul 27 09:30:16 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8640c525-e6ba-4bf8-9fe0-2c08155dd1cb/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:16 user nova-compute[70374]: DEBUG nova.network.neutron [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Successfully created port: ce2809b5-cc30-4a29-9770-7df186b16406 {{(pid=70374) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} Jul 27 09:30:16 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:16 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:16 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=70374) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Jul 27 09:30:16 user nova-compute[70374]: INFO nova.compute.claims [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Claim successful on node user Jul 27 09:30:16 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8640c525-e6ba-4bf8-9fe0-2c08155dd1cb/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:16 user nova-compute[70374]: DEBUG nova.virt.disk.api [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Cannot resize image /opt/stack/data/nova/instances/8640c525-e6ba-4bf8-9fe0-2c08155dd1cb/disk to a smaller size. 
{{(pid=70374) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Jul 27 09:30:16 user nova-compute[70374]: DEBUG nova.objects.instance [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Lazy-loading 'migration_context' on Instance uuid 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:30:16 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Created local disks {{(pid=70374) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4851}} Jul 27 09:30:16 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Ensure instance console log exists: /opt/stack/data/nova/instances/8640c525-e6ba-4bf8-9fe0-2c08155dd1cb/console.log {{(pid=70374) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4603}} Jul 27 09:30:16 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:16 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:16 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:16 user sudo[79592]: pam_unix(sudo:session): session closed for user root Jul 27 09:30:16 user nova-compute[70374]: INFO oslo.privsep.daemon [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Spawned new privsep daemon via rootwrap Jul 27 09:30:16 user nova-compute[70374]: INFO oslo.privsep.daemon [-] privsep daemon starting Jul 27 09:30:16 user nova-compute[70374]: INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0 Jul 27 09:30:16 user nova-compute[70374]: INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none Jul 27 09:30:16 user nova-compute[70374]: INFO oslo.privsep.daemon [-] privsep daemon running as pid 79611 Jul 27 09:30:17 user nova-compute[70374]: DEBUG nova.compute.provider_tree [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 
tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Inventory has not changed in ProviderTree for provider: f7548644-4a09-4ad8-9aa6-6e05d85a9f5b {{(pid=70374) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Jul 27 09:30:17 user nova-compute[70374]: DEBUG nova.network.neutron [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Successfully updated port: 98a9819c-4270-4b75-b851-635a3b19a7b4 {{(pid=70374) _update_port /opt/stack/nova/nova/network/neutron.py:586}} Jul 27 09:30:17 user nova-compute[70374]: DEBUG nova.scheduler.client.report [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Inventory has not changed for provider f7548644-4a09-4ad8-9aa6-6e05d85a9f5b based on inventory data: {'MEMORY_MB': {'total': 16011, 'reserved': 512, 'min_unit': 1, 'max_unit': 16011, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70374) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Jul 27 09:30:17 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Acquiring lock "refresh_cache-6a5593cd-3ab6-4859-b0fb-33a0ed702dc8" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Jul 27 09:30:17 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Acquired lock "refresh_cache-6a5593cd-3ab6-4859-b0fb-33a0ed702dc8" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Jul 27 09:30:17 user nova-compute[70374]: DEBUG nova.network.neutron [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Building network info cache for instance {{(pid=70374) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} Jul 27 09:30:17 user nova-compute[70374]: DEBUG nova.compute.manager [req-2bf1a339-43ce-4d5c-ac03-cf923f486ace req-c8843656-4cf7-4a3a-a9cb-845eccbe7e9e service nova] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Received event network-changed-98a9819c-4270-4b75-b851-635a3b19a7b4 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:30:17 user nova-compute[70374]: DEBUG nova.compute.manager [req-2bf1a339-43ce-4d5c-ac03-cf923f486ace req-c8843656-4cf7-4a3a-a9cb-845eccbe7e9e service nova] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Refreshing instance network info cache due to event network-changed-98a9819c-4270-4b75-b851-635a3b19a7b4. 
{{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Jul 27 09:30:17 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-2bf1a339-43ce-4d5c-ac03-cf923f486ace req-c8843656-4cf7-4a3a-a9cb-845eccbe7e9e service nova] Acquiring lock "refresh_cache-6a5593cd-3ab6-4859-b0fb-33a0ed702dc8" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Jul 27 09:30:17 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.496s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:17 user nova-compute[70374]: DEBUG nova.compute.manager [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Start building networks asynchronously for instance. {{(pid=70374) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} Jul 27 09:30:17 user nova-compute[70374]: DEBUG nova.network.neutron [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Instance cache missing network info. {{(pid=70374) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Jul 27 09:30:17 user nova-compute[70374]: DEBUG nova.compute.manager [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Allocating IP information in the background. {{(pid=70374) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} Jul 27 09:30:17 user nova-compute[70374]: DEBUG nova.network.neutron [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] allocate_for_instance() {{(pid=70374) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} Jul 27 09:30:17 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Jul 27 09:30:17 user nova-compute[70374]: DEBUG nova.compute.manager [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Start building block device mappings for instance. {{(pid=70374) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} Jul 27 09:30:17 user nova-compute[70374]: DEBUG nova.compute.manager [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Start spawning the instance on the hypervisor. 
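The acquired/waited and released/held pairs in these entries (compute_resources held 0.496s by instance_claim, vgpu_resources around _allocate_mdevs, the per-file disk.info lock) are emitted by oslo.concurrency's named locks. A minimal sketch of the same pattern, assuming only that the oslo.concurrency library is installed; the lock names mirror the log, the bodies are placeholders:

from oslo_concurrency import lockutils

# Decorator form: every caller using the same lock name is serialized, and the
# wrapper logs the "acquired ... waited" / "released ... held" lines seen above.
@lockutils.synchronized('compute_resources')
def instance_claim():
    ...  # mutate shared resource-tracker state while holding the lock

# Context-manager form, handy for ad-hoc names such as "refresh_cache-<uuid>".
with lockutils.lock('vgpu_resources'):
    ...  # allocate mdevs while holding the lock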
{{(pid=70374) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} Jul 27 09:30:17 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Creating instance directory {{(pid=70374) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Jul 27 09:30:17 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Creating image(s) Jul 27 09:30:17 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Acquiring lock "/opt/stack/data/nova/instances/25214e8a-c626-46a7-b273-eb491c2fc91b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:17 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Lock "/opt/stack/data/nova/instances/25214e8a-c626-46a7-b273-eb491c2fc91b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:17 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Lock "/opt/stack/data/nova/instances/25214e8a-c626-46a7-b273-eb491c2fc91b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.007s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:17 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:17 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:17 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6858719a-7f, may_exist=True, interface_attrs={}) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:30:17 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6858719a-7f, col_values=(('external_ids', 
{'iface-id': '6858719a-7fa4-4a64-ad8a-02c9274adb55', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e6:e0:42', 'vm-uuid': '04a990b9-ed32-41e9-b384-f0886e8d1b49'}),), if_exists=True) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:30:17 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:17 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Jul 27 09:30:17 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:17 user nova-compute[70374]: INFO os_vif [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e6:e0:42,bridge_name='br-int',has_traffic_filtering=True,id=6858719a-7fa4-4a64-ad8a-02c9274adb55,network=Network(e1a0003b-0f63-4efa-9d24-933f88eb0373),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6858719a-7f') Jul 27 09:30:17 user nova-compute[70374]: DEBUG nova.policy [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9d4c5ec795ee4d3387ce8718d2ac67e0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0592f0be670742a181e24823955f378b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70374) authorize /opt/stack/nova/nova/policy.py:203}} Jul 27 09:30:17 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] No BDM found with device name vda, not building metadata. 
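Each qemu-img info probe in this spawn path (the Running cmd entry above and its "CMD ... returned" counterpart below) is executed through oslo.concurrency's processutils with a prlimit guard of 1 GiB address space and 30 s of CPU, which is what produces the python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info ... command lines. A sketch of that call using only processutils; the image path is copied from the log:

from oslo_concurrency import processutils

# Same guard as the logged prlimit wrapper: 1 GiB of address space, 30 s of CPU.
limits = processutils.ProcessLimits(cpu_time=30, address_space=1073741824)

stdout, stderr = processutils.execute(
    'env', 'LC_ALL=C', 'LANG=C',
    'qemu-img', 'info',
    '/opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1',
    '--force-share', '--output=json',
    prlimit=limits)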
{{(pid=70374) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Jul 27 09:30:17 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] No VIF found with MAC fa:16:3e:e6:e0:42, not building metadata {{(pid=70374) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Jul 27 09:30:17 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json" returned: 0 in 0.152s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:17 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Acquiring lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:17 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:17 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:17 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json" returned: 0 in 0.119s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:17 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1,backing_fmt=raw /opt/stack/data/nova/instances/25214e8a-c626-46a7-b273-eb491c2fc91b/disk 1073741824 
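The qemu-img create just logged is the copy-on-write image backend at work: the per-instance file under .../instances/25214e8a-.../disk is a qcow2 overlay whose backing file is the shared raw base image in _base, created at the flavor's 1 GiB root size so only writes consume new space. The same command as a plain shell-out, with the paths copied from the log:

import os
import subprocess

base = '/opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1'
disk = '/opt/stack/data/nova/instances/25214e8a-c626-46a7-b273-eb491c2fc91b/disk'

# qcow2 overlay on top of the shared raw base image, 1 GiB virtual size.
subprocess.run(
    ['qemu-img', 'create', '-f', 'qcow2',
     '-o', 'backing_file=%s,backing_fmt=raw' % base,
     disk, '1073741824'],
    check=True, env={**os.environ, 'LC_ALL': 'C', 'LANG': 'C'})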
{{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:17 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1,backing_fmt=raw /opt/stack/data/nova/instances/25214e8a-c626-46a7-b273-eb491c2fc91b/disk 1073741824" returned: 0 in 0.055s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:17 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.178s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:17 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:17 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json" returned: 0 in 0.188s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:17 user nova-compute[70374]: DEBUG nova.virt.disk.api [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Checking if we can resize image /opt/stack/data/nova/instances/25214e8a-c626-46a7-b273-eb491c2fc91b/disk. 
size=1073741824 {{(pid=70374) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Jul 27 09:30:17 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/25214e8a-c626-46a7-b273-eb491c2fc91b/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG nova.network.neutron [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Successfully created port: e6dc311e-c8d0-4b0c-bf42-96919f9d6f02 {{(pid=70374) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG nova.network.neutron [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Updating instance_info_cache with network_info: [{"id": "98a9819c-4270-4b75-b851-635a3b19a7b4", "address": "fa:16:3e:de:fb:c0", "network": {"id": "6faafed3-97ab-43b5-bf25-0d0a468935f8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-754617940-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "0d99bef1aeee4c6090e60bdbb0ecbeda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap98a9819c-42", "ovs_interfaceid": "98a9819c-4270-4b75-b851-635a3b19a7b4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/25214e8a-c626-46a7-b273-eb491c2fc91b/disk --force-share --output=json" returned: 0 in 0.150s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG nova.virt.disk.api [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Cannot resize image /opt/stack/data/nova/instances/25214e8a-c626-46a7-b273-eb491c2fc91b/disk to a smaller size. 
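The pair of entries around this point ("Checking if we can resize image ... size=1073741824" followed by "Cannot resize image ... to a smaller size.") is Nova's grow-only guard: the overlay was created with a 1 GiB virtual size and the flavor asks for the same 1 GiB, so no resize is attempted. A sketch of that check against qemu-img's JSON output; the helper name is ours, and the strictly-larger threshold is inferred from the log above, where an equal size is declined:

import json
import subprocess

def can_resize_image(path, requested_size):
    # Grow-only: a resize is attempted only beyond the current virtual size.
    info = subprocess.run(
        ['qemu-img', 'info', path, '--force-share', '--output=json'],
        check=True, capture_output=True, text=True)
    virtual_size = json.loads(info.stdout)['virtual-size']
    return requested_size > virtual_size

# Here the overlay is already 1073741824 bytes and the flavor requests the same
# 1 GiB, so the check declines and the disk is left as created.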
{{(pid=70374) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG nova.objects.instance [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Lazy-loading 'migration_context' on Instance uuid 25214e8a-c626-46a7-b273-eb491c2fc91b {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Created local disks {{(pid=70374) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4851}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Ensure instance console log exists: /opt/stack/data/nova/instances/25214e8a-c626-46a7-b273-eb491c2fc91b/console.log {{(pid=70374) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4603}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Releasing lock "refresh_cache-6a5593cd-3ab6-4859-b0fb-33a0ed702dc8" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG nova.compute.manager [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Instance network_info: |[{"id": "98a9819c-4270-4b75-b851-635a3b19a7b4", "address": "fa:16:3e:de:fb:c0", "network": {"id": "6faafed3-97ab-43b5-bf25-0d0a468935f8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-754617940-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "0d99bef1aeee4c6090e60bdbb0ecbeda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap98a9819c-42", "ovs_interfaceid": "98a9819c-4270-4b75-b851-635a3b19a7b4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70374) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-2bf1a339-43ce-4d5c-ac03-cf923f486ace req-c8843656-4cf7-4a3a-a9cb-845eccbe7e9e service nova] Acquired lock "refresh_cache-6a5593cd-3ab6-4859-b0fb-33a0ed702dc8" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG nova.network.neutron [req-2bf1a339-43ce-4d5c-ac03-cf923f486ace req-c8843656-4cf7-4a3a-a9cb-845eccbe7e9e service nova] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Refreshing network info cache for port 98a9819c-4270-4b75-b851-635a3b19a7b4 {{(pid=70374) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Start _get_guest_xml network_info=[{"id": "98a9819c-4270-4b75-b851-635a3b19a7b4", "address": "fa:16:3e:de:fb:c0", "network": {"id": "6faafed3-97ab-43b5-bf25-0d0a468935f8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-754617940-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "0d99bef1aeee4c6090e60bdbb0ecbeda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap98a9819c-42", "ovs_interfaceid": "98a9819c-4270-4b75-b851-635a3b19a7b4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-07-27T09:26:27Z,direct_url=,disk_format='qcow2',id=35458adf-261a-4e0b-a4db-b243619b2394,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='37aff788807d4b2aafc10d355583f7c7',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-07-27T09:26:29Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encrypted': False, 'device_type': 'disk', 'guest_format': None, 'boot_index': 
0, 'encryption_format': None, 'size': 0, 'encryption_options': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'image_id': '35458adf-261a-4e0b-a4db-b243619b2394'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70374) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7553}} Jul 27 09:30:18 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Jul 27 09:30:18 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Jul 27 09:30:18 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Searching host: 'user' for CPU controller through CGroups V1... {{(pid=70374) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1650}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] CPU controller missing on host. {{(pid=70374) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1660}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Searching host: 'user' for CPU controller through CGroups V2... {{(pid=70374) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1669}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] CPU controller found on host. 
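The host probe logged above walks the same path on every spawn: no CPU controller is found on a cgroups v1 mount, then the cpu controller turns up on the unified cgroups v2 hierarchy, which is what allows CPU tuning to be applied to the guest. A minimal way to reproduce the v2 half of that check, assuming the standard /sys/fs/cgroup mount point; this is an illustration, not Nova's code:

from pathlib import Path

def has_cgroupsv2_cpu_controller(root='/sys/fs/cgroup'):
    # On the unified (v2) hierarchy the available controllers are listed in
    # cgroup.controllers at the mount root, e.g. "cpuset cpu io memory ...".
    controllers = Path(root, 'cgroup.controllers')
    return controllers.exists() and 'cpu' in controllers.read_text().split()

print(has_cgroupsv2_cpu_controller())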
{{(pid=70374) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1676}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70374) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-07-27T09:26:16Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1',id=6,is_public=True,memory_mb=512,name='m1.tiny',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-07-27T09:26:27Z,direct_url=,disk_format='qcow2',id=35458adf-261a-4e0b-a4db-b243619b2394,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='37aff788807d4b2aafc10d355583f7c7',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-07-27T09:26:29Z,virtual_size=,visibility=), allow threads: True {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Flavor limits 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Image limits 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Flavor pref 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Image pref 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70374) 
_get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70374) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Got 1 possible topologies {{(pid=70374) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG nova.virt.libvirt.vif [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-07-27T09:30:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1737227100',display_name='tempest-DeleteServersTestJSON-server-1737227100',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(6),hidden=False,host='user',hostname='tempest-deleteserverstestjson-server-1737227100',id=3,image_ref='35458adf-261a-4e0b-a4db-b243619b2394',info_cache=InstanceInfoCache,instance_type_id=6,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0d99bef1aeee4c6090e60bdbb0ecbeda',ramdisk_id='',reservation_id='r-vup3cqqc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='35458adf-261a-4e0b-a4db-b243619b2394',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-322957873',owner_user_name='tempest-DeleteServersTestJSON-322957873-project-member'},tags=TagList
,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-07-27T09:30:13Z,user_data=None,user_id='c87590f95d1147f48a94203d8be751ab',uuid=6a5593cd-3ab6-4859-b0fb-33a0ed702dc8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "98a9819c-4270-4b75-b851-635a3b19a7b4", "address": "fa:16:3e:de:fb:c0", "network": {"id": "6faafed3-97ab-43b5-bf25-0d0a468935f8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-754617940-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "0d99bef1aeee4c6090e60bdbb0ecbeda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap98a9819c-42", "ovs_interfaceid": "98a9819c-4270-4b75-b851-635a3b19a7b4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70374) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Converting VIF {"id": "98a9819c-4270-4b75-b851-635a3b19a7b4", "address": "fa:16:3e:de:fb:c0", "network": {"id": "6faafed3-97ab-43b5-bf25-0d0a468935f8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-754617940-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "0d99bef1aeee4c6090e60bdbb0ecbeda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap98a9819c-42", "ovs_interfaceid": "98a9819c-4270-4b75-b851-635a3b19a7b4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:de:fb:c0,bridge_name='br-int',has_traffic_filtering=True,id=98a9819c-4270-4b75-b851-635a3b19a7b4,network=Network(6faafed3-97ab-43b5-bf25-0d0a468935f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap98a9819c-42') {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG nova.objects.instance [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Lazy-loading 'pci_devices' on Instance uuid 
6a5593cd-3ab6-4859-b0fb-33a0ed702dc8 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] End _get_guest_xml xml= [guest domain XML omitted: the XML markup was stripped in this capture, leaving only element text; the surviving values show uuid 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8, name instance-00000003, memory 524288 KiB, 1 vCPU, nova metadata (server tempest-DeleteServersTestJSON-server-1737227100, created 2023-07-27 09:30:18, flavor figures 512 / 1 / 0 / 0 / 1, owner tempest-DeleteServersTestJSON-322957873 / tempest-DeleteServersTestJSON-322957873-project-member), sysinfo OpenStack Foundation / OpenStack Nova / 0.0.0 / Virtual Machine, os type hvm, CPU model Nehalem, and an RNG device backed by /dev/urandom] {{(pid=70374) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7559}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG nova.virt.libvirt.vif [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-07-27T09:30:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1737227100',display_name='tempest-DeleteServersTestJSON-server-1737227100',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(6),hidden=False,host='user',hostname='tempest-deleteserverstestjson-server-1737227100',id=3,image_ref='35458adf-261a-4e0b-a4db-b243619b2394',info_cache=InstanceInfoCache,instance_type_id=6,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0d99bef1aeee4c6090e60bdbb0ecbeda',ramdisk_id='',reservation_id='r-vup3cqqc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='35458adf-261a-4e0b-a4db-b243619b2394',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-322957873',owner_user_name='tempest-DeleteServersTestJSON-322957873-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-07-27T09:30:13Z,user_data=None,user_id='c87590f95d1147f48a94203d8be751ab',uuid=6a5593cd-3ab6-4859-b0fb-33a0ed702dc8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "98a9819c-4270-4b75-b851-635a3b19a7b4", "address": "fa:16:3e:de:fb:c0", "network": {"id": "6faafed3-97ab-43b5-bf25-0d0a468935f8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-754617940-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway",
"version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "0d99bef1aeee4c6090e60bdbb0ecbeda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap98a9819c-42", "ovs_interfaceid": "98a9819c-4270-4b75-b851-635a3b19a7b4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Converting VIF {"id": "98a9819c-4270-4b75-b851-635a3b19a7b4", "address": "fa:16:3e:de:fb:c0", "network": {"id": "6faafed3-97ab-43b5-bf25-0d0a468935f8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-754617940-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "0d99bef1aeee4c6090e60bdbb0ecbeda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap98a9819c-42", "ovs_interfaceid": "98a9819c-4270-4b75-b851-635a3b19a7b4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:de:fb:c0,bridge_name='br-int',has_traffic_filtering=True,id=98a9819c-4270-4b75-b851-635a3b19a7b4,network=Network(6faafed3-97ab-43b5-bf25-0d0a468935f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap98a9819c-42') {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG os_vif [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:fb:c0,bridge_name='br-int',has_traffic_filtering=True,id=98a9819c-4270-4b75-b851-635a3b19a7b4,network=Network(6faafed3-97ab-43b5-bf25-0d0a468935f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap98a9819c-42') {{(pid=70374) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn 
n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap98a9819c-42, may_exist=True, interface_attrs={}) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap98a9819c-42, col_values=(('external_ids', {'iface-id': '98a9819c-4270-4b75-b851-635a3b19a7b4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:de:fb:c0', 'vm-uuid': '6a5593cd-3ab6-4859-b0fb-33a0ed702dc8'}),), if_exists=True) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:18 user nova-compute[70374]: INFO os_vif [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:fb:c0,bridge_name='br-int',has_traffic_filtering=True,id=98a9819c-4270-4b75-b851-635a3b19a7b4,network=Network(6faafed3-97ab-43b5-bf25-0d0a468935f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap98a9819c-42') Jul 27 09:30:18 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] No BDM found with device name vda, not building metadata. 
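The os-vif plug above boils down to an idempotent OVSDB transaction: make sure br-int exists (may_exist=True, hence "Transaction caused no change"), add the tap device as a port, and stamp the Interface row's external_ids with the Neutron port id, status, MAC and instance UUID so the OVN/Neutron side can bind it. The same transaction expressed with the ovs-vsctl CLI, values copied from the log; os-vif itself goes through ovsdbapp as logged, so this is only an equivalent:

import subprocess

subprocess.run(
    ['ovs-vsctl',
     '--may-exist', 'add-br', 'br-int',
     '--', '--may-exist', 'add-port', 'br-int', 'tap98a9819c-42',
     '--', 'set', 'Interface', 'tap98a9819c-42',
     'external_ids:iface-id=98a9819c-4270-4b75-b851-635a3b19a7b4',
     'external_ids:iface-status=active',
     'external_ids:attached-mac=fa:16:3e:de:fb:c0',
     'external_ids:vm-uuid=6a5593cd-3ab6-4859-b0fb-33a0ed702dc8'],
    check=True)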
{{(pid=70374) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] No VIF found with MAC fa:16:3e:de:fb:c0, not building metadata {{(pid=70374) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG nova.network.neutron [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Successfully updated port: fd47b104-c8ff-4b76-8f3a-53725f9f318c {{(pid=70374) _update_port /opt/stack/nova/nova/network/neutron.py:586}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Acquiring lock "refresh_cache-fb5ccac9-1e45-4726-b681-cf34cf3fa521" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Acquired lock "refresh_cache-fb5ccac9-1e45-4726-b681-cf34cf3fa521" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG nova.network.neutron [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Building network info cache for instance {{(pid=70374) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG nova.network.neutron [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Instance cache missing network info. {{(pid=70374) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG nova.compute.manager [req-eb38fdd4-8a1b-4d35-9f46-904a861b2f22 req-96424af7-e05d-4f97-bd1a-c30984cfd118 service nova] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Received event network-changed-fd47b104-c8ff-4b76-8f3a-53725f9f318c {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG nova.compute.manager [req-eb38fdd4-8a1b-4d35-9f46-904a861b2f22 req-96424af7-e05d-4f97-bd1a-c30984cfd118 service nova] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Refreshing instance network info cache due to event network-changed-fd47b104-c8ff-4b76-8f3a-53725f9f318c. 
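The req-eb38fdd4-.../"service nova" entries are the receiving end of Nova's external server events: when the port is updated, Neutron posts a network-changed event tagged with the port id to Nova's os-server-external-events API, and the compute manager responds by refreshing that instance's network info cache under its per-instance lock. Roughly the shape of such an event, with values taken from the log; the exact field set is an assumption, not something copied from this capture:

# Illustrative body for Nova's os-server-external-events API
# (values from the log entries above; field names are the commonly
# documented ones, not taken verbatim from this capture).
event_body = {
    "events": [{
        "name": "network-changed",
        "server_uuid": "fb5ccac9-1e45-4726-b681-cf34cf3fa521",
        "tag": "fd47b104-c8ff-4b76-8f3a-53725f9f318c",  # Neutron port id
    }]
}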
{{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Jul 27 09:30:18 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-eb38fdd4-8a1b-4d35-9f46-904a861b2f22 req-96424af7-e05d-4f97-bd1a-c30984cfd118 service nova] Acquiring lock "refresh_cache-fb5ccac9-1e45-4726-b681-cf34cf3fa521" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Jul 27 09:30:19 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:19 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:19 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:19 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:19 user nova-compute[70374]: DEBUG nova.network.neutron [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Successfully updated port: ce2809b5-cc30-4a29-9770-7df186b16406 {{(pid=70374) _update_port /opt/stack/nova/nova/network/neutron.py:586}} Jul 27 09:30:19 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Acquiring lock "refresh_cache-6ae93f34-ce7e-4ae4-a5ba-36508361bd54" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Jul 27 09:30:19 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Acquired lock "refresh_cache-6ae93f34-ce7e-4ae4-a5ba-36508361bd54" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Jul 27 09:30:19 user nova-compute[70374]: DEBUG nova.network.neutron [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Building network info cache for instance {{(pid=70374) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} Jul 27 09:30:19 user nova-compute[70374]: DEBUG nova.network.neutron [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Instance cache missing network info. 
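The network_info payloads logged just below (and earlier for the other instances) are the instance info cache entries that _get_guest_xml later consumes: devname names the tap device, address is the port's MAC, and the subnet's fixed IP rides along in the ips list. A small illustration of pulling those fields out of one such entry, trimmed to the keys used here (structure copied from the log):

network_info = [{
    "id": "fd47b104-c8ff-4b76-8f3a-53725f9f318c",
    "address": "fa:16:3e:57:cb:7b",
    "network": {
        "bridge": "br-int",
        "meta": {"mtu": 1442},
        "subnets": [{"cidr": "10.0.0.0/28",
                     "ips": [{"address": "10.0.0.9", "type": "fixed"}]}],
    },
    "devname": "tapfd47b104-c8",
    "type": "ovs",
}]

vif = network_info[0]
print(vif["devname"], vif["address"],
      vif["network"]["subnets"][0]["ips"][0]["address"])
# -> tapfd47b104-c8 fa:16:3e:57:cb:7b 10.0.0.9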
{{(pid=70374) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Jul 27 09:30:19 user nova-compute[70374]: DEBUG nova.compute.manager [req-a765d3c0-bbef-48fa-b350-6a518a5ea414 req-28e37a0c-07b4-407a-bf17-9ba09dabc2e4 service nova] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Received event network-changed-ce2809b5-cc30-4a29-9770-7df186b16406 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:30:19 user nova-compute[70374]: DEBUG nova.compute.manager [req-a765d3c0-bbef-48fa-b350-6a518a5ea414 req-28e37a0c-07b4-407a-bf17-9ba09dabc2e4 service nova] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Refreshing instance network info cache due to event network-changed-ce2809b5-cc30-4a29-9770-7df186b16406. {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Jul 27 09:30:19 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-a765d3c0-bbef-48fa-b350-6a518a5ea414 req-28e37a0c-07b4-407a-bf17-9ba09dabc2e4 service nova] Acquiring lock "refresh_cache-6ae93f34-ce7e-4ae4-a5ba-36508361bd54" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG nova.network.neutron [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Updating instance_info_cache with network_info: [{"id": "fd47b104-c8ff-4b76-8f3a-53725f9f318c", "address": "fa:16:3e:57:cb:7b", "network": {"id": "de376ae8-a0af-4eda-95e8-dfb1ca0f2283", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1261629972-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8acda82ef3f428fbb93847922a213d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd47b104-c8", "ovs_interfaceid": "fd47b104-c8ff-4b76-8f3a-53725f9f318c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Releasing lock "refresh_cache-fb5ccac9-1e45-4726-b681-cf34cf3fa521" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG nova.compute.manager [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Instance network_info: |[{"id": "fd47b104-c8ff-4b76-8f3a-53725f9f318c", "address": "fa:16:3e:57:cb:7b", "network": {"id": "de376ae8-a0af-4eda-95e8-dfb1ca0f2283", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1261629972-network", "subnets": 
[{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8acda82ef3f428fbb93847922a213d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd47b104-c8", "ovs_interfaceid": "fd47b104-c8ff-4b76-8f3a-53725f9f318c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70374) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-eb38fdd4-8a1b-4d35-9f46-904a861b2f22 req-96424af7-e05d-4f97-bd1a-c30984cfd118 service nova] Acquired lock "refresh_cache-fb5ccac9-1e45-4726-b681-cf34cf3fa521" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG nova.network.neutron [req-eb38fdd4-8a1b-4d35-9f46-904a861b2f22 req-96424af7-e05d-4f97-bd1a-c30984cfd118 service nova] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Refreshing network info cache for port fd47b104-c8ff-4b76-8f3a-53725f9f318c {{(pid=70374) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Start _get_guest_xml network_info=[{"id": "fd47b104-c8ff-4b76-8f3a-53725f9f318c", "address": "fa:16:3e:57:cb:7b", "network": {"id": "de376ae8-a0af-4eda-95e8-dfb1ca0f2283", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1261629972-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8acda82ef3f428fbb93847922a213d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd47b104-c8", "ovs_interfaceid": "fd47b104-c8ff-4b76-8f3a-53725f9f318c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-07-27T09:26:27Z,direct_url=,disk_format='qcow2',id=35458adf-261a-4e0b-a4db-b243619b2394,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='37aff788807d4b2aafc10d355583f7c7',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-07-27T09:26:29Z,virtual_size=,visibility=) rescue=None 
block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encrypted': False, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'size': 0, 'encryption_options': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'image_id': '35458adf-261a-4e0b-a4db-b243619b2394'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70374) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7553}} Jul 27 09:30:20 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Jul 27 09:30:20 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Jul 27 09:30:20 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Searching host: 'user' for CPU controller through CGroups V1... {{(pid=70374) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1650}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] CPU controller missing on host. {{(pid=70374) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1660}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Searching host: 'user' for CPU controller through CGroups V2... {{(pid=70374) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1669}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] CPU controller found on host. 
{{(pid=70374) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1676}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70374) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-07-27T09:26:16Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1',id=6,is_public=True,memory_mb=512,name='m1.tiny',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-07-27T09:26:27Z,direct_url=,disk_format='qcow2',id=35458adf-261a-4e0b-a4db-b243619b2394,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='37aff788807d4b2aafc10d355583f7c7',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-07-27T09:26:29Z,virtual_size=,visibility=), allow threads: True {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Flavor limits 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Image limits 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Flavor pref 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Image pref 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Topology preferred 
VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70374) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Got 1 possible topologies {{(pid=70374) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG nova.virt.libvirt.vif [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-07-27T09:30:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-2041850193',display_name='tempest-ServerRescueNegativeTestJSON-server-2041850193',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(6),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-2041850193',id=2,image_ref='35458adf-261a-4e0b-a4db-b243619b2394',info_cache=InstanceInfoCache,instance_type_id=6,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d8acda82ef3f428fbb93847922a213d1',ramdisk_id='',reservation_id='r-44a0z2kc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='35458adf-261a-4e0b-a4db-b243619b2394',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-241559119',owner_user_name='tempest-ServerRescueNegativeTestJSON-241559119-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-07-27T09:30:11Z,user_data=None,user_id='d059c8dff3644f3c9c0c54498a4d78f7',uuid=fb5ccac9-1e45-4726-b681-cf34cf3fa521,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fd47b104-c8ff-4b76-8f3a-53725f9f318c", "address": "fa:16:3e:57:cb:7b", "network": {"id": "de376ae8-a0af-4eda-95e8-dfb1ca0f2283", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1261629972-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8acda82ef3f428fbb93847922a213d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd47b104-c8", "ovs_interfaceid": "fd47b104-c8ff-4b76-8f3a-53725f9f318c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70374) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Converting VIF {"id": "fd47b104-c8ff-4b76-8f3a-53725f9f318c", "address": 
"fa:16:3e:57:cb:7b", "network": {"id": "de376ae8-a0af-4eda-95e8-dfb1ca0f2283", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1261629972-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8acda82ef3f428fbb93847922a213d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd47b104-c8", "ovs_interfaceid": "fd47b104-c8ff-4b76-8f3a-53725f9f318c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:cb:7b,bridge_name='br-int',has_traffic_filtering=True,id=fd47b104-c8ff-4b76-8f3a-53725f9f318c,network=Network(de376ae8-a0af-4eda-95e8-dfb1ca0f2283),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd47b104-c8') {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG nova.objects.instance [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Lazy-loading 'pci_devices' on Instance uuid fb5ccac9-1e45-4726-b681-cf34cf3fa521 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] End _get_guest_xml xml= Jul 27 09:30:20 user nova-compute[70374]: fb5ccac9-1e45-4726-b681-cf34cf3fa521 Jul 27 09:30:20 user nova-compute[70374]: instance-00000002 Jul 27 09:30:20 user nova-compute[70374]: 524288 Jul 27 09:30:20 user nova-compute[70374]: 1 Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: tempest-ServerRescueNegativeTestJSON-server-2041850193 Jul 27 09:30:20 user nova-compute[70374]: 2023-07-27 09:30:20 Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: 512 Jul 27 09:30:20 user nova-compute[70374]: 1 Jul 27 09:30:20 user nova-compute[70374]: 0 Jul 27 09:30:20 user nova-compute[70374]: 0 Jul 27 09:30:20 user nova-compute[70374]: 1 Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: tempest-ServerRescueNegativeTestJSON-241559119-project-member Jul 27 09:30:20 user nova-compute[70374]: 
tempest-ServerRescueNegativeTestJSON-241559119 Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: OpenStack Foundation Jul 27 09:30:20 user nova-compute[70374]: OpenStack Nova Jul 27 09:30:20 user nova-compute[70374]: 0.0.0 Jul 27 09:30:20 user nova-compute[70374]: fb5ccac9-1e45-4726-b681-cf34cf3fa521 Jul 27 09:30:20 user nova-compute[70374]: fb5ccac9-1e45-4726-b681-cf34cf3fa521 Jul 27 09:30:20 user nova-compute[70374]: Virtual Machine Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: hvm Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Nehalem Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: /dev/urandom Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: Jul 27 09:30:20 user nova-compute[70374]: {{(pid=70374) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7559}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG nova.virt.libvirt.vif [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-07-27T09:30:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-2041850193',display_name='tempest-ServerRescueNegativeTestJSON-server-2041850193',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(6),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-2041850193',id=2,image_ref='35458adf-261a-4e0b-a4db-b243619b2394',info_cache=InstanceInfoCache,instance_type_id=6,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d8acda82ef3f428fbb93847922a213d1',ramdisk_id='',reservation_id='r-44a0z2kc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='35458adf-261a-4e0b-a4db-b243619b2394',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-241559119',owner_user_name='tempest-ServerRescueNegativeTestJSON-241559119-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-07-27T09:30:11Z,user_data=None,user_id='d059c8dff3644f3c9c0c54498a4d78f7',uuid=fb5ccac9-1e45-4726-b681-cf34cf3fa521,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fd47b104-c8ff-4b76-8f3a-53725f9f318c", "address": "fa:16:3e:57:cb:7b", "network": {"id": "de376ae8-a0af-4eda-95e8-dfb1ca0f2283", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1261629972-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8acda82ef3f428fbb93847922a213d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd47b104-c8", "ovs_interfaceid": "fd47b104-c8ff-4b76-8f3a-53725f9f318c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Converting VIF {"id": "fd47b104-c8ff-4b76-8f3a-53725f9f318c", "address": 
"fa:16:3e:57:cb:7b", "network": {"id": "de376ae8-a0af-4eda-95e8-dfb1ca0f2283", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1261629972-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8acda82ef3f428fbb93847922a213d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd47b104-c8", "ovs_interfaceid": "fd47b104-c8ff-4b76-8f3a-53725f9f318c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:cb:7b,bridge_name='br-int',has_traffic_filtering=True,id=fd47b104-c8ff-4b76-8f3a-53725f9f318c,network=Network(de376ae8-a0af-4eda-95e8-dfb1ca0f2283),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd47b104-c8') {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG os_vif [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:cb:7b,bridge_name='br-int',has_traffic_filtering=True,id=fd47b104-c8ff-4b76-8f3a-53725f9f318c,network=Network(de376ae8-a0af-4eda-95e8-dfb1ca0f2283),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd47b104-c8') {{(pid=70374) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfd47b104-c8, may_exist=True, interface_attrs={}) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:30:20 user nova-compute[70374]: 
DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfd47b104-c8, col_values=(('external_ids', {'iface-id': 'fd47b104-c8ff-4b76-8f3a-53725f9f318c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:57:cb:7b', 'vm-uuid': 'fb5ccac9-1e45-4726-b681-cf34cf3fa521'}),), if_exists=True) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:20 user nova-compute[70374]: INFO os_vif [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:cb:7b,bridge_name='br-int',has_traffic_filtering=True,id=fd47b104-c8ff-4b76-8f3a-53725f9f318c,network=Network(de376ae8-a0af-4eda-95e8-dfb1ca0f2283),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd47b104-c8') Jul 27 09:30:20 user nova-compute[70374]: DEBUG nova.network.neutron [req-2bf1a339-43ce-4d5c-ac03-cf923f486ace req-c8843656-4cf7-4a3a-a9cb-845eccbe7e9e service nova] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Updated VIF entry in instance network info cache for port 98a9819c-4270-4b75-b851-635a3b19a7b4. {{(pid=70374) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG nova.network.neutron [req-2bf1a339-43ce-4d5c-ac03-cf923f486ace req-c8843656-4cf7-4a3a-a9cb-845eccbe7e9e service nova] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Updating instance_info_cache with network_info: [{"id": "98a9819c-4270-4b75-b851-635a3b19a7b4", "address": "fa:16:3e:de:fb:c0", "network": {"id": "6faafed3-97ab-43b5-bf25-0d0a468935f8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-754617940-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "0d99bef1aeee4c6090e60bdbb0ecbeda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap98a9819c-42", "ovs_interfaceid": "98a9819c-4270-4b75-b851-635a3b19a7b4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-2bf1a339-43ce-4d5c-ac03-cf923f486ace req-c8843656-4cf7-4a3a-a9cb-845eccbe7e9e service nova] Releasing lock "refresh_cache-6a5593cd-3ab6-4859-b0fb-33a0ed702dc8" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None 
req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] No BDM found with device name vda, not building metadata. {{(pid=70374) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] No VIF found with MAC fa:16:3e:57:cb:7b, not building metadata {{(pid=70374) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG nova.network.neutron [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Successfully created port: b825d9b4-15c4-4b47-a3f7-af9838d09458 {{(pid=70374) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:20 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG nova.compute.manager [req-237e5cea-74df-4e12-9b83-39a6c4bb25bc req-778fa801-d737-4df4-abd6-68a114e18420 service nova] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Received event network-vif-plugged-6858719a-7fa4-4a64-ad8a-02c9274adb55 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-237e5cea-74df-4e12-9b83-39a6c4bb25bc req-778fa801-d737-4df4-abd6-68a114e18420 service nova] Acquiring lock "04a990b9-ed32-41e9-b384-f0886e8d1b49-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-237e5cea-74df-4e12-9b83-39a6c4bb25bc req-778fa801-d737-4df4-abd6-68a114e18420 service nova] Lock "04a990b9-ed32-41e9-b384-f0886e8d1b49-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-237e5cea-74df-4e12-9b83-39a6c4bb25bc req-778fa801-d737-4df4-abd6-68a114e18420 service nova] Lock "04a990b9-ed32-41e9-b384-f0886e8d1b49-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG nova.compute.manager [req-237e5cea-74df-4e12-9b83-39a6c4bb25bc
req-778fa801-d737-4df4-abd6-68a114e18420 service nova] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] No waiting events found dispatching network-vif-plugged-6858719a-7fa4-4a64-ad8a-02c9274adb55 {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:30:21 user nova-compute[70374]: WARNING nova.compute.manager [req-237e5cea-74df-4e12-9b83-39a6c4bb25bc req-778fa801-d737-4df4-abd6-68a114e18420 service nova] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Received unexpected event network-vif-plugged-6858719a-7fa4-4a64-ad8a-02c9274adb55 for instance with vm_state building and task_state spawning. Jul 27 09:30:21 user nova-compute[70374]: DEBUG nova.network.neutron [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Updating instance_info_cache with network_info: [{"id": "ce2809b5-cc30-4a29-9770-7df186b16406", "address": "fa:16:3e:ef:0f:b2", "network": {"id": "de376ae8-a0af-4eda-95e8-dfb1ca0f2283", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1261629972-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8acda82ef3f428fbb93847922a213d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapce2809b5-cc", "ovs_interfaceid": "ce2809b5-cc30-4a29-9770-7df186b16406", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Releasing lock "refresh_cache-6ae93f34-ce7e-4ae4-a5ba-36508361bd54" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG nova.compute.manager [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Instance network_info: |[{"id": "ce2809b5-cc30-4a29-9770-7df186b16406", "address": "fa:16:3e:ef:0f:b2", "network": {"id": "de376ae8-a0af-4eda-95e8-dfb1ca0f2283", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1261629972-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8acda82ef3f428fbb93847922a213d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapce2809b5-cc", 
"ovs_interfaceid": "ce2809b5-cc30-4a29-9770-7df186b16406", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70374) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-a765d3c0-bbef-48fa-b350-6a518a5ea414 req-28e37a0c-07b4-407a-bf17-9ba09dabc2e4 service nova] Acquired lock "refresh_cache-6ae93f34-ce7e-4ae4-a5ba-36508361bd54" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG nova.network.neutron [req-a765d3c0-bbef-48fa-b350-6a518a5ea414 req-28e37a0c-07b4-407a-bf17-9ba09dabc2e4 service nova] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Refreshing network info cache for port ce2809b5-cc30-4a29-9770-7df186b16406 {{(pid=70374) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Start _get_guest_xml network_info=[{"id": "ce2809b5-cc30-4a29-9770-7df186b16406", "address": "fa:16:3e:ef:0f:b2", "network": {"id": "de376ae8-a0af-4eda-95e8-dfb1ca0f2283", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1261629972-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8acda82ef3f428fbb93847922a213d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapce2809b5-cc", "ovs_interfaceid": "ce2809b5-cc30-4a29-9770-7df186b16406", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-07-27T09:26:27Z,direct_url=,disk_format='qcow2',id=35458adf-261a-4e0b-a4db-b243619b2394,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='37aff788807d4b2aafc10d355583f7c7',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-07-27T09:26:29Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encrypted': False, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'size': 0, 'encryption_options': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'image_id': '35458adf-261a-4e0b-a4db-b243619b2394'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70374) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7553}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG nova.network.neutron [None 
req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Successfully updated port: e6dc311e-c8d0-4b0c-bf42-96919f9d6f02 {{(pid=70374) _update_port /opt/stack/nova/nova/network/neutron.py:586}} Jul 27 09:30:21 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Jul 27 09:30:21 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Jul 27 09:30:21 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Acquiring lock "refresh_cache-8640c525-e6ba-4bf8-9fe0-2c08155dd1cb" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Acquired lock "refresh_cache-8640c525-e6ba-4bf8-9fe0-2c08155dd1cb" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG nova.network.neutron [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Building network info cache for instance {{(pid=70374) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Searching host: 'user' for CPU controller through CGroups V1... {{(pid=70374) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1650}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] CPU controller missing on host. {{(pid=70374) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1660}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Searching host: 'user' for CPU controller through CGroups V2... 
{{(pid=70374) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1669}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] CPU controller found on host. {{(pid=70374) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1676}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70374) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-07-27T09:26:16Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1',id=6,is_public=True,memory_mb=512,name='m1.tiny',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-07-27T09:26:27Z,direct_url=,disk_format='qcow2',id=35458adf-261a-4e0b-a4db-b243619b2394,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='37aff788807d4b2aafc10d355583f7c7',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-07-27T09:26:29Z,virtual_size=,visibility=), allow threads: True {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Flavor limits 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Image limits 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Flavor pref 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Image pref 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, 
threads=65536 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70374) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Got 1 possible topologies {{(pid=70374) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG nova.virt.libvirt.vif [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-07-27T09:30:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1457979026',display_name='tempest-ServerRescueNegativeTestJSON-server-1457979026',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(6),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-1457979026',id=4,image_ref='35458adf-261a-4e0b-a4db-b243619b2394',info_cache=InstanceInfoCache,instance_type_id=6,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d8acda82ef3f428fbb93847922a213d1',ramdisk_id='',reservation_id='r-x508h1z9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='35458adf-261a-4e0b-a4db-b243619b2394',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-241559119',owner_user_name='tempest-ServerRescueNegativeTestJSON-241559119-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-07-27T09:30:13Z,user_data=None,user_id='d059c8dff3644f3c9c0c54498a4d78f7',uuid=6ae93f34-ce7e-4ae4-a5ba-36508361bd54,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ce2809b5-cc30-4a29-9770-7df186b16406", "address": "fa:16:3e:ef:0f:b2", "network": {"id": "de376ae8-a0af-4eda-95e8-dfb1ca0f2283", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1261629972-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8acda82ef3f428fbb93847922a213d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapce2809b5-cc", "ovs_interfaceid": "ce2809b5-cc30-4a29-9770-7df186b16406", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70374) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Converting VIF {"id": "ce2809b5-cc30-4a29-9770-7df186b16406", "address": 
"fa:16:3e:ef:0f:b2", "network": {"id": "de376ae8-a0af-4eda-95e8-dfb1ca0f2283", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1261629972-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8acda82ef3f428fbb93847922a213d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapce2809b5-cc", "ovs_interfaceid": "ce2809b5-cc30-4a29-9770-7df186b16406", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ef:0f:b2,bridge_name='br-int',has_traffic_filtering=True,id=ce2809b5-cc30-4a29-9770-7df186b16406,network=Network(de376ae8-a0af-4eda-95e8-dfb1ca0f2283),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce2809b5-cc') {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG nova.objects.instance [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Lazy-loading 'pci_devices' on Instance uuid 6ae93f34-ce7e-4ae4-a5ba-36508361bd54 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG nova.compute.manager [req-05c1b402-a195-437f-ba93-89117886ae9f req-67e1a680-8d9c-4dfa-80eb-4ff95b9abc8c service nova] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Received event network-changed-e6dc311e-c8d0-4b0c-bf42-96919f9d6f02 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG nova.compute.manager [req-05c1b402-a195-437f-ba93-89117886ae9f req-67e1a680-8d9c-4dfa-80eb-4ff95b9abc8c service nova] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Refreshing instance network info cache due to event network-changed-e6dc311e-c8d0-4b0c-bf42-96919f9d6f02. 
{{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}}
Jul 27 09:30:21 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-05c1b402-a195-437f-ba93-89117886ae9f req-67e1a680-8d9c-4dfa-80eb-4ff95b9abc8c service nova] Acquiring lock "refresh_cache-8640c525-e6ba-4bf8-9fe0-2c08155dd1cb" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
Jul 27 09:30:21 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] End _get_guest_xml xml=
[libvirt guest XML for instance-00000004 elided: the XML markup was stripped when this log was captured, leaving only bare text nodes spread across the journal lines that followed; surviving values include uuid 6ae93f34-ce7e-4ae4-a5ba-36508361bd54, domain name instance-00000004, title tempest-ServerRescueNegativeTestJSON-server-1457979026, creation time 2023-07-27 09:30:21, 524288 KiB memory, 1 vCPU, owner tempest-ServerRescueNegativeTestJSON-241559119 / tempest-ServerRescueNegativeTestJSON-241559119-project-member, sysinfo OpenStack Foundation / OpenStack Nova / 0.0.0, os type hvm, CPU model Nehalem, and the RNG backend path /dev/urandom]
Jul 27 09:30:21 user nova-compute[70374]: {{(pid=70374) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7559}}
Jul 27 09:30:21 user nova-compute[70374]: DEBUG nova.virt.libvirt.vif [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-07-27T09:30:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1457979026',display_name='tempest-ServerRescueNegativeTestJSON-server-1457979026',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(6),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-1457979026',id=4,image_ref='35458adf-261a-4e0b-a4db-b243619b2394',info_cache=InstanceInfoCache,instance_type_id=6,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d8acda82ef3f428fbb93847922a213d1',ramdisk_id='',reservation_id='r-x508h1z9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='35458adf-261a-4e0b-a4db-b243619b2394',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-241559119',owner_user_name='tempest-ServerRescueNegativeTestJSON-241559119-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-07-27T09:30:13Z,user_data=None,user_id='d059c8dff3644f3c9c0c54498a4d78f7',uuid=6ae93f34-ce7e-4ae4-a5ba-36508361bd54,vcpu_m
odel=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ce2809b5-cc30-4a29-9770-7df186b16406", "address": "fa:16:3e:ef:0f:b2", "network": {"id": "de376ae8-a0af-4eda-95e8-dfb1ca0f2283", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1261629972-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8acda82ef3f428fbb93847922a213d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapce2809b5-cc", "ovs_interfaceid": "ce2809b5-cc30-4a29-9770-7df186b16406", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Converting VIF {"id": "ce2809b5-cc30-4a29-9770-7df186b16406", "address": "fa:16:3e:ef:0f:b2", "network": {"id": "de376ae8-a0af-4eda-95e8-dfb1ca0f2283", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1261629972-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8acda82ef3f428fbb93847922a213d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapce2809b5-cc", "ovs_interfaceid": "ce2809b5-cc30-4a29-9770-7df186b16406", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ef:0f:b2,bridge_name='br-int',has_traffic_filtering=True,id=ce2809b5-cc30-4a29-9770-7df186b16406,network=Network(de376ae8-a0af-4eda-95e8-dfb1ca0f2283),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce2809b5-cc') {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG os_vif [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Plugging vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:ef:0f:b2,bridge_name='br-int',has_traffic_filtering=True,id=ce2809b5-cc30-4a29-9770-7df186b16406,network=Network(de376ae8-a0af-4eda-95e8-dfb1ca0f2283),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce2809b5-cc') {{(pid=70374) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapce2809b5-cc, may_exist=True, interface_attrs={}) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapce2809b5-cc, col_values=(('external_ids', {'iface-id': 'ce2809b5-cc30-4a29-9770-7df186b16406', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ef:0f:b2', 'vm-uuid': '6ae93f34-ce7e-4ae4-a5ba-36508361bd54'}),), if_exists=True) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:21 user nova-compute[70374]: INFO os_vif [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:0f:b2,bridge_name='br-int',has_traffic_filtering=True,id=ce2809b5-cc30-4a29-9770-7df186b16406,network=Network(de376ae8-a0af-4eda-95e8-dfb1ca0f2283),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce2809b5-cc') Jul 27 09:30:21 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 
tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] No BDM found with device name vda, not building metadata. {{(pid=70374) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] No VIF found with MAC fa:16:3e:ef:0f:b2, not building metadata {{(pid=70374) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG nova.network.neutron [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Instance cache missing network info. {{(pid=70374) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG nova.network.neutron [req-eb38fdd4-8a1b-4d35-9f46-904a861b2f22 req-96424af7-e05d-4f97-bd1a-c30984cfd118 service nova] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Updated VIF entry in instance network info cache for port fd47b104-c8ff-4b76-8f3a-53725f9f318c. {{(pid=70374) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG nova.network.neutron [req-eb38fdd4-8a1b-4d35-9f46-904a861b2f22 req-96424af7-e05d-4f97-bd1a-c30984cfd118 service nova] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Updating instance_info_cache with network_info: [{"id": "fd47b104-c8ff-4b76-8f3a-53725f9f318c", "address": "fa:16:3e:57:cb:7b", "network": {"id": "de376ae8-a0af-4eda-95e8-dfb1ca0f2283", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1261629972-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8acda82ef3f428fbb93847922a213d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd47b104-c8", "ovs_interfaceid": "fd47b104-c8ff-4b76-8f3a-53725f9f318c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:30:21 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-eb38fdd4-8a1b-4d35-9f46-904a861b2f22 req-96424af7-e05d-4f97-bd1a-c30984cfd118 service nova] Releasing lock "refresh_cache-fb5ccac9-1e45-4726-b681-cf34cf3fa521" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Jul 27 09:30:22 user nova-compute[70374]: DEBUG nova.compute.manager [req-8a03e8da-0ed8-4041-8b48-5fcaa38b9fea req-d51c763c-4d90-4c1b-b39b-b67516afc1e3 service nova] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Received event network-vif-plugged-98a9819c-4270-4b75-b851-635a3b19a7b4 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:30:22 user nova-compute[70374]: DEBUG 
oslo_concurrency.lockutils [req-8a03e8da-0ed8-4041-8b48-5fcaa38b9fea req-d51c763c-4d90-4c1b-b39b-b67516afc1e3 service nova] Acquiring lock "6a5593cd-3ab6-4859-b0fb-33a0ed702dc8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:22 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-8a03e8da-0ed8-4041-8b48-5fcaa38b9fea req-d51c763c-4d90-4c1b-b39b-b67516afc1e3 service nova] Lock "6a5593cd-3ab6-4859-b0fb-33a0ed702dc8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:22 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-8a03e8da-0ed8-4041-8b48-5fcaa38b9fea req-d51c763c-4d90-4c1b-b39b-b67516afc1e3 service nova] Lock "6a5593cd-3ab6-4859-b0fb-33a0ed702dc8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:22 user nova-compute[70374]: DEBUG nova.compute.manager [req-8a03e8da-0ed8-4041-8b48-5fcaa38b9fea req-d51c763c-4d90-4c1b-b39b-b67516afc1e3 service nova] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] No waiting events found dispatching network-vif-plugged-98a9819c-4270-4b75-b851-635a3b19a7b4 {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:30:22 user nova-compute[70374]: WARNING nova.compute.manager [req-8a03e8da-0ed8-4041-8b48-5fcaa38b9fea req-d51c763c-4d90-4c1b-b39b-b67516afc1e3 service nova] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Received unexpected event network-vif-plugged-98a9819c-4270-4b75-b851-635a3b19a7b4 for instance with vm_state building and task_state spawning. 
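The "Acquiring lock / acquired / released" triplet on the per-instance "…-events" lock above is the standard oslo.concurrency pattern: a callable wrapped with lockutils.synchronized() logs those three DEBUG lines, including the waited/held timings, from its "inner" wrapper. A minimal sketch of that pattern (not Nova's actual code; the lock name is copied from the log):

from oslo_concurrency import lockutils

# lockutils.synchronized() is what emits the Acquiring/acquired/released
# DEBUG lines seen above, with "waited"/"held" timings, from lockutils' inner
# wrapper function.
@lockutils.synchronized('6a5593cd-3ab6-4859-b0fb-33a0ed702dc8-events')
def _pop_event():
    # In Nova this would pop a waiting 'network-vif-plugged' event; returning
    # None corresponds to the "No waiting events found" message logged above.
    return None

_pop_event()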
Jul 27 09:30:22 user nova-compute[70374]: DEBUG nova.network.neutron [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Updating instance_info_cache with network_info: [{"id": "e6dc311e-c8d0-4b0c-bf42-96919f9d6f02", "address": "fa:16:3e:21:44:72", "network": {"id": "fb617df7-46b2-48ba-beb1-29eefa43aa47", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-96492016-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "bfaa69b3745a435795aa636ccddec3af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape6dc311e-c8", "ovs_interfaceid": "e6dc311e-c8d0-4b0c-bf42-96919f9d6f02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:30:22 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Releasing lock "refresh_cache-8640c525-e6ba-4bf8-9fe0-2c08155dd1cb" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Jul 27 09:30:22 user nova-compute[70374]: DEBUG nova.compute.manager [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Instance network_info: |[{"id": "e6dc311e-c8d0-4b0c-bf42-96919f9d6f02", "address": "fa:16:3e:21:44:72", "network": {"id": "fb617df7-46b2-48ba-beb1-29eefa43aa47", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-96492016-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "bfaa69b3745a435795aa636ccddec3af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape6dc311e-c8", "ovs_interfaceid": "e6dc311e-c8d0-4b0c-bf42-96919f9d6f02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70374) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} Jul 27 09:30:22 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-05c1b402-a195-437f-ba93-89117886ae9f req-67e1a680-8d9c-4dfa-80eb-4ff95b9abc8c service nova] Acquired lock "refresh_cache-8640c525-e6ba-4bf8-9fe0-2c08155dd1cb" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Jul 27 09:30:22 user nova-compute[70374]: 
DEBUG nova.network.neutron [req-05c1b402-a195-437f-ba93-89117886ae9f req-67e1a680-8d9c-4dfa-80eb-4ff95b9abc8c service nova] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Refreshing network info cache for port e6dc311e-c8d0-4b0c-bf42-96919f9d6f02 {{(pid=70374) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Jul 27 09:30:22 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Start _get_guest_xml network_info=[{"id": "e6dc311e-c8d0-4b0c-bf42-96919f9d6f02", "address": "fa:16:3e:21:44:72", "network": {"id": "fb617df7-46b2-48ba-beb1-29eefa43aa47", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-96492016-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "bfaa69b3745a435795aa636ccddec3af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape6dc311e-c8", "ovs_interfaceid": "e6dc311e-c8d0-4b0c-bf42-96919f9d6f02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-07-27T09:26:27Z,direct_url=,disk_format='qcow2',id=35458adf-261a-4e0b-a4db-b243619b2394,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='37aff788807d4b2aafc10d355583f7c7',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-07-27T09:26:29Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encrypted': False, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'size': 0, 'encryption_options': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'image_id': '35458adf-261a-4e0b-a4db-b243619b2394'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70374) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7553}} Jul 27 09:30:22 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Jul 27 09:30:22 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
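The "Start _get_guest_xml" entry above logs the disk_info and block_device_info structures the libvirt driver builds the domain XML from. Purely as an illustration of that shape (the dict below is copied from the log entry; this is not Nova code), the root disk's device name and bus can be read out like this:

# disk_info as logged by "Start _get_guest_xml" for instance 8640c525-...,
# copied from the entry above.
disk_info = {
    'disk_bus': 'virtio',
    'cdrom_bus': 'ide',
    'mapping': {
        'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'},
        'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'},
    },
}

root = disk_info['mapping']['root']
# -> "root disk vda on bus virtio, boot_index 1"
print("root disk %s on bus %s, boot_index %s"
      % (root['dev'], root['bus'], root['boot_index']))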
Jul 27 09:30:22 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Searching host: 'user' for CPU controller through CGroups V1... {{(pid=70374) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1650}} Jul 27 09:30:22 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] CPU controller missing on host. {{(pid=70374) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1660}} Jul 27 09:30:22 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Searching host: 'user' for CPU controller through CGroups V2... {{(pid=70374) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1669}} Jul 27 09:30:22 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] CPU controller found on host. {{(pid=70374) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1676}} Jul 27 09:30:22 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70374) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Jul 27 09:30:22 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-07-27T09:26:16Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1',id=6,is_public=True,memory_mb=512,name='m1.tiny',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-07-27T09:26:27Z,direct_url=,disk_format='qcow2',id=35458adf-261a-4e0b-a4db-b243619b2394,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='37aff788807d4b2aafc10d355583f7c7',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-07-27T09:26:29Z,virtual_size=,visibility=), allow threads: True {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Jul 27 09:30:22 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Flavor limits 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Jul 27 09:30:22 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Image limits 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} 
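The cgroups probing above ("CPU controller missing on host" for CGroups V1, "CPU controller found on host" for CGroups V2) comes down to checking which controllers the kernel exposes. A rough approximation of the v2 check, not Nova's exact implementation, assuming a standard cgroup-v2 mount at /sys/fs/cgroup:

def has_cgroupsv2_cpu_controller(path='/sys/fs/cgroup/cgroup.controllers'):
    # On a cgroup-v2 host the enabled root controllers are listed,
    # space-separated, in cgroup.controllers; the host can enforce CPU
    # limits for guests if 'cpu' is among them.
    try:
        with open(path) as f:
            return 'cpu' in f.read().split()
    except OSError:
        return False

print(has_cgroupsv2_cpu_controller())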
Jul 27 09:30:22 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Flavor pref 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Jul 27 09:30:22 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Image pref 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Jul 27 09:30:22 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Jul 27 09:30:22 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Jul 27 09:30:22 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70374) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Jul 27 09:30:22 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Got 1 possible topologies {{(pid=70374) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Jul 27 09:30:22 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Jul 27 09:30:22 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Jul 27 09:30:22 user nova-compute[70374]: DEBUG nova.virt.libvirt.vif [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-07-27T09:30:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1273462667',display_name='tempest-AttachVolumeNegativeTest-server-1273462667',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(6),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-1273462667',id=5,image_ref='35458adf-261a-4e0b-a4db-b243619b2394',info_cache=InstanceInfoCache,instance_type_id=6,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIGY2O+H9PqW27GcPB7HFQV9b1fArRsb4Q62SdnMok2xNMJx4Ndfl41fCfu4LIuZgNm69n1GAW5tYyT6DlwoIS80yrefNCzUIO8hhqDUSVPM5InUIY/dZEfr1Hu+cJST4A==',key_name='tempest-keypair-1385781482',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bfaa69b3745a435795aa636ccddec3af',ramdisk_id='',reservation_id='r-ocp00dmy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='35458adf-261a-4e0b-a4db-b243619b2394',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-305502590',owner_user_name='tempest-AttachVolumeNegativeTest-305502590-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-07-27T09:30:16Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='4c1d49c23b0045d881d9f47e33447162',uuid=8640c525-e6ba-4bf8-9fe0-2c08155dd1cb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e6dc311e-c8d0-4b0c-bf42-96919f9d6f02", "address": "fa:16:3e:21:44:72", "network": {"id": "fb617df7-46b2-48ba-beb1-29eefa43aa47", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-96492016-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "bfaa69b3745a435795aa636ccddec3af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape6dc311e-c8", "ovs_interfaceid": "e6dc311e-c8d0-4b0c-bf42-96919f9d6f02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70374) get_config 
/opt/stack/nova/nova/virt/libvirt/vif.py:563}}
Jul 27 09:30:22 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Converting VIF {"id": "e6dc311e-c8d0-4b0c-bf42-96919f9d6f02", "address": "fa:16:3e:21:44:72", "network": {"id": "fb617df7-46b2-48ba-beb1-29eefa43aa47", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-96492016-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "bfaa69b3745a435795aa636ccddec3af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape6dc311e-c8", "ovs_interfaceid": "e6dc311e-c8d0-4b0c-bf42-96919f9d6f02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}}
Jul 27 09:30:22 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:21:44:72,bridge_name='br-int',has_traffic_filtering=True,id=e6dc311e-c8d0-4b0c-bf42-96919f9d6f02,network=Network(fb617df7-46b2-48ba-beb1-29eefa43aa47),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6dc311e-c8') {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}}
Jul 27 09:30:22 user nova-compute[70374]: DEBUG nova.objects.instance [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Lazy-loading 'pci_devices' on Instance uuid 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}}
Jul 27 09:30:22 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] End _get_guest_xml xml=
[libvirt guest XML for instance-00000005 elided: the XML markup was stripped when this log was captured, leaving only bare text nodes spread across the journal lines that followed; surviving values include uuid 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb, domain name instance-00000005, title tempest-AttachVolumeNegativeTest-server-1273462667, creation time 2023-07-27 09:30:22, 524288 KiB memory, 1 vCPU, owner tempest-AttachVolumeNegativeTest-305502590 / tempest-AttachVolumeNegativeTest-305502590-project-member, sysinfo OpenStack Foundation / OpenStack Nova / 0.0.0, os type hvm, CPU model Nehalem, and the RNG backend path /dev/urandom]
Jul 27 09:30:22 user nova-compute[70374]: {{(pid=70374) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7559}}
Jul 27 09:30:22 user nova-compute[70374]: DEBUG nova.virt.libvirt.vif [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] vif_type=ovs
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-07-27T09:30:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1273462667',display_name='tempest-AttachVolumeNegativeTest-server-1273462667',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(6),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-1273462667',id=5,image_ref='35458adf-261a-4e0b-a4db-b243619b2394',info_cache=InstanceInfoCache,instance_type_id=6,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIGY2O+H9PqW27GcPB7HFQV9b1fArRsb4Q62SdnMok2xNMJx4Ndfl41fCfu4LIuZgNm69n1GAW5tYyT6DlwoIS80yrefNCzUIO8hhqDUSVPM5InUIY/dZEfr1Hu+cJST4A==',key_name='tempest-keypair-1385781482',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bfaa69b3745a435795aa636ccddec3af',ramdisk_id='',reservation_id='r-ocp00dmy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='35458adf-261a-4e0b-a4db-b243619b2394',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-305502590',owner_user_name='tempest-AttachVolumeNegativeTest-305502590-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-07-27T09:30:16Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='4c1d49c23b0045d881d9f47e33447162',uuid=8640c525-e6ba-4bf8-9fe0-2c08155dd1cb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e6dc311e-c8d0-4b0c-bf42-96919f9d6f02", "address": "fa:16:3e:21:44:72", "network": {"id": "fb617df7-46b2-48ba-beb1-29eefa43aa47", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-96492016-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "bfaa69b3745a435795aa636ccddec3af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape6dc311e-c8", "ovs_interfaceid": "e6dc311e-c8d0-4b0c-bf42-96919f9d6f02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) plug 
/opt/stack/nova/nova/virt/libvirt/vif.py:710}} Jul 27 09:30:22 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Converting VIF {"id": "e6dc311e-c8d0-4b0c-bf42-96919f9d6f02", "address": "fa:16:3e:21:44:72", "network": {"id": "fb617df7-46b2-48ba-beb1-29eefa43aa47", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-96492016-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "bfaa69b3745a435795aa636ccddec3af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape6dc311e-c8", "ovs_interfaceid": "e6dc311e-c8d0-4b0c-bf42-96919f9d6f02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Jul 27 09:30:22 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:21:44:72,bridge_name='br-int',has_traffic_filtering=True,id=e6dc311e-c8d0-4b0c-bf42-96919f9d6f02,network=Network(fb617df7-46b2-48ba-beb1-29eefa43aa47),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6dc311e-c8') {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Jul 27 09:30:22 user nova-compute[70374]: DEBUG os_vif [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:21:44:72,bridge_name='br-int',has_traffic_filtering=True,id=e6dc311e-c8d0-4b0c-bf42-96919f9d6f02,network=Network(fb617df7-46b2-48ba-beb1-29eefa43aa47),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6dc311e-c8') {{(pid=70374) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Jul 27 09:30:22 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:22 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:30:22 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Jul 27 09:30:22 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:22 user nova-compute[70374]: DEBUG 
ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape6dc311e-c8, may_exist=True, interface_attrs={}) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:30:22 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape6dc311e-c8, col_values=(('external_ids', {'iface-id': 'e6dc311e-c8d0-4b0c-bf42-96919f9d6f02', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:21:44:72', 'vm-uuid': '8640c525-e6ba-4bf8-9fe0-2c08155dd1cb'}),), if_exists=True) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:30:22 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Jul 27 09:30:22 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:22 user nova-compute[70374]: INFO os_vif [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:21:44:72,bridge_name='br-int',has_traffic_filtering=True,id=e6dc311e-c8d0-4b0c-bf42-96919f9d6f02,network=Network(fb617df7-46b2-48ba-beb1-29eefa43aa47),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6dc311e-c8') Jul 27 09:30:22 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] No BDM found with device name vda, not building metadata. 
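The AddPortCommand/DbSetCommand pair logged just above is ovsdbapp's OVS-schema API at work: add the tap device to br-int, then stamp the Neutron port ID, MAC and instance UUID into the Interface's external_ids so the SDN backend (OVN, per bound_drivers above) can identify and bind the port. A minimal standalone sketch of the same two-command transaction, assuming a local Open vSwitch with its database socket at /var/run/openvswitch/db.sock (the socket path and timeout are assumptions; port names, MAC and UUIDs are taken from the log):

from ovsdbapp.backend.ovs_idl import connection
from ovsdbapp.schema.open_vswitch import impl_idl

# Connect to the local ovsdb-server (socket path assumed for this sketch).
idl = connection.OvsdbIdl.from_server('unix:/var/run/openvswitch/db.sock',
                                      'Open_vSwitch')
ovs = impl_idl.OvsdbIdl(connection.Connection(idl, timeout=10))

external_ids = {
    'iface-id': 'e6dc311e-c8d0-4b0c-bf42-96919f9d6f02',   # Neutron port UUID
    'iface-status': 'active',
    'attached-mac': 'fa:16:3e:21:44:72',
    'vm-uuid': '8640c525-e6ba-4bf8-9fe0-2c08155dd1cb',    # Nova instance UUID
}

# One transaction, two commands -- mirrors the AddPortCommand and
# DbSetCommand entries in the log above.
with ovs.transaction(check_error=True) as txn:
    txn.add(ovs.add_port('br-int', 'tape6dc311e-c8', may_exist=True))
    txn.add(ovs.db_set('Interface', 'tape6dc311e-c8',
                       ('external_ids', external_ids)))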
{{(pid=70374) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Jul 27 09:30:22 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] No VIF found with MAC fa:16:3e:21:44:72, not building metadata {{(pid=70374) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Jul 27 09:30:23 user nova-compute[70374]: DEBUG nova.network.neutron [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Successfully updated port: b825d9b4-15c4-4b47-a3f7-af9838d09458 {{(pid=70374) _update_port /opt/stack/nova/nova/network/neutron.py:586}} Jul 27 09:30:23 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Acquiring lock "refresh_cache-25214e8a-c626-46a7-b273-eb491c2fc91b" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Jul 27 09:30:23 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Acquired lock "refresh_cache-25214e8a-c626-46a7-b273-eb491c2fc91b" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Jul 27 09:30:23 user nova-compute[70374]: DEBUG nova.network.neutron [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Building network info cache for instance {{(pid=70374) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} Jul 27 09:30:23 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:23 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:23 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:23 user nova-compute[70374]: DEBUG nova.compute.manager [req-348e73a8-1b23-4c8c-8555-8c3c88170049 req-1502df48-8fd1-4041-b150-726b2fda4662 service nova] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Received event network-vif-plugged-6858719a-7fa4-4a64-ad8a-02c9274adb55 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:30:23 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-348e73a8-1b23-4c8c-8555-8c3c88170049 req-1502df48-8fd1-4041-b150-726b2fda4662 service nova] Acquiring lock "04a990b9-ed32-41e9-b384-f0886e8d1b49-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:23 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-348e73a8-1b23-4c8c-8555-8c3c88170049 req-1502df48-8fd1-4041-b150-726b2fda4662 service nova] Lock 
"04a990b9-ed32-41e9-b384-f0886e8d1b49-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:23 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-348e73a8-1b23-4c8c-8555-8c3c88170049 req-1502df48-8fd1-4041-b150-726b2fda4662 service nova] Lock "04a990b9-ed32-41e9-b384-f0886e8d1b49-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:23 user nova-compute[70374]: DEBUG nova.compute.manager [req-348e73a8-1b23-4c8c-8555-8c3c88170049 req-1502df48-8fd1-4041-b150-726b2fda4662 service nova] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] No waiting events found dispatching network-vif-plugged-6858719a-7fa4-4a64-ad8a-02c9274adb55 {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:30:23 user nova-compute[70374]: WARNING nova.compute.manager [req-348e73a8-1b23-4c8c-8555-8c3c88170049 req-1502df48-8fd1-4041-b150-726b2fda4662 service nova] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Received unexpected event network-vif-plugged-6858719a-7fa4-4a64-ad8a-02c9274adb55 for instance with vm_state building and task_state spawning. Jul 27 09:30:23 user nova-compute[70374]: DEBUG nova.network.neutron [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Instance cache missing network info. {{(pid=70374) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Jul 27 09:30:23 user nova-compute[70374]: DEBUG nova.network.neutron [req-a765d3c0-bbef-48fa-b350-6a518a5ea414 req-28e37a0c-07b4-407a-bf17-9ba09dabc2e4 service nova] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Updated VIF entry in instance network info cache for port ce2809b5-cc30-4a29-9770-7df186b16406. 
{{(pid=70374) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Jul 27 09:30:23 user nova-compute[70374]: DEBUG nova.network.neutron [req-a765d3c0-bbef-48fa-b350-6a518a5ea414 req-28e37a0c-07b4-407a-bf17-9ba09dabc2e4 service nova] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Updating instance_info_cache with network_info: [{"id": "ce2809b5-cc30-4a29-9770-7df186b16406", "address": "fa:16:3e:ef:0f:b2", "network": {"id": "de376ae8-a0af-4eda-95e8-dfb1ca0f2283", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1261629972-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8acda82ef3f428fbb93847922a213d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapce2809b5-cc", "ovs_interfaceid": "ce2809b5-cc30-4a29-9770-7df186b16406", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:30:23 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-a765d3c0-bbef-48fa-b350-6a518a5ea414 req-28e37a0c-07b4-407a-bf17-9ba09dabc2e4 service nova] Releasing lock "refresh_cache-6ae93f34-ce7e-4ae4-a5ba-36508361bd54" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Jul 27 09:30:23 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:23 user nova-compute[70374]: DEBUG nova.compute.manager [req-48ad7955-1a31-46c4-9493-6247e752746e req-698b72d5-5340-42a8-b20f-d23d5ca9e1d7 service nova] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Received event network-changed-b825d9b4-15c4-4b47-a3f7-af9838d09458 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:30:23 user nova-compute[70374]: DEBUG nova.compute.manager [req-48ad7955-1a31-46c4-9493-6247e752746e req-698b72d5-5340-42a8-b20f-d23d5ca9e1d7 service nova] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Refreshing instance network info cache due to event network-changed-b825d9b4-15c4-4b47-a3f7-af9838d09458. 
{{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Jul 27 09:30:23 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-48ad7955-1a31-46c4-9493-6247e752746e req-698b72d5-5340-42a8-b20f-d23d5ca9e1d7 service nova] Acquiring lock "refresh_cache-25214e8a-c626-46a7-b273-eb491c2fc91b" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Jul 27 09:30:23 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:23 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:23 user nova-compute[70374]: DEBUG nova.virt.driver [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Emitting event Resumed> {{(pid=70374) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Jul 27 09:30:23 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] VM Resumed (Lifecycle Event) Jul 27 09:30:23 user nova-compute[70374]: DEBUG nova.compute.manager [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Instance event wait completed in 0 seconds for {{(pid=70374) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Jul 27 09:30:23 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Guest created on hypervisor {{(pid=70374) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4411}} Jul 27 09:30:23 user nova-compute[70374]: INFO nova.virt.libvirt.driver [-] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Instance spawned successfully. 
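The AddPortCommand/DbSetCommand transaction and the "Successfully plugged vif" line earlier in this stretch are os-vif's ovs plugin talking to the local OVSDB. Below is a minimal standalone sketch of the same two-step transaction issued through ovsdbapp; the database socket path and the timeout are assumptions, not values taken from this log.

# Sketch only: replay the AddPort + DbSet transaction for tape6dc311e-c8 by hand.
from ovsdbapp.backend.ovs_idl import connection
from ovsdbapp.schema.open_vswitch import impl_idl

OVSDB = 'unix:/var/run/openvswitch/db.sock'  # assumed socket path

idl = connection.OvsdbIdl.from_server(OVSDB, 'Open_vSwitch')
api = impl_idl.OvsdbIdl(connection.Connection(idl=idl, timeout=10))

with api.transaction(check_error=True) as txn:
    # command(idx=0): add the tap port to br-int, tolerating an existing row
    txn.add(api.add_port('br-int', 'tape6dc311e-c8', may_exist=True))
    # command(idx=1): set external_ids so the OVN/Neutron side can bind the port
    txn.add(api.db_set('Interface', 'tape6dc311e-c8',
                       ('external_ids', {
                           'iface-id': 'e6dc311e-c8d0-4b0c-bf42-96919f9d6f02',
                           'iface-status': 'active',
                           'attached-mac': 'fa:16:3e:21:44:72',
                           'vm-uuid': '8640c525-e6ba-4bf8-9fe0-2c08155dd1cb'})))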
Jul 27 09:30:23 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:894}} Jul 27 09:30:23 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:23 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:30:23 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Found default for hw_cdrom_bus of ide {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:23 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Found default for hw_disk_bus of virtio {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:23 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Found default for hw_input_bus of None {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:23 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Found default for hw_pointer_model of None {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:23 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Found default for hw_video_model of virtio {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:23 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Found default for hw_vif_model of virtio {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:23 user nova-compute[70374]: DEBUG nova.compute.manager [None 
req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70374) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Jul 27 09:30:24 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] During sync_power_state the instance has a pending task (spawning). Skip. Jul 27 09:30:24 user nova-compute[70374]: DEBUG nova.virt.driver [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Emitting event Started> {{(pid=70374) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Jul 27 09:30:24 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] VM Started (Lifecycle Event) Jul 27 09:30:24 user nova-compute[70374]: INFO nova.compute.manager [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Took 16.35 seconds to spawn the instance on the hypervisor. Jul 27 09:30:24 user nova-compute[70374]: DEBUG nova.compute.manager [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70374) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG nova.network.neutron [req-05c1b402-a195-437f-ba93-89117886ae9f req-67e1a680-8d9c-4dfa-80eb-4ff95b9abc8c service nova] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Updated VIF entry in instance network info cache for port e6dc311e-c8d0-4b0c-bf42-96919f9d6f02. 
{{(pid=70374) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG nova.network.neutron [req-05c1b402-a195-437f-ba93-89117886ae9f req-67e1a680-8d9c-4dfa-80eb-4ff95b9abc8c service nova] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Updating instance_info_cache with network_info: [{"id": "e6dc311e-c8d0-4b0c-bf42-96919f9d6f02", "address": "fa:16:3e:21:44:72", "network": {"id": "fb617df7-46b2-48ba-beb1-29eefa43aa47", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-96492016-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "bfaa69b3745a435795aa636ccddec3af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape6dc311e-c8", "ovs_interfaceid": "e6dc311e-c8d0-4b0c-bf42-96919f9d6f02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-05c1b402-a195-437f-ba93-89117886ae9f req-67e1a680-8d9c-4dfa-80eb-4ff95b9abc8c service nova] Releasing lock "refresh_cache-8640c525-e6ba-4bf8-9fe0-2c08155dd1cb" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Jul 27 09:30:24 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] During sync_power_state the instance has a pending task (spawning). Skip. Jul 27 09:30:24 user nova-compute[70374]: INFO nova.compute.manager [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Took 17.25 seconds to build instance. 
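The "Synchronizing instance power state ... current DB power_state: 0, VM power_state: 1" lines above compare Nova's stored power state with what the hypervisor reports, and skip the sync while a task is still pending. A rough sketch of that decision follows; the numeric constants mirror the values printed in the log (0 = NOSTATE, 1 = RUNNING), and the skip rule shown covers only the case visible here.

# Power-state codes as they appear in the log lines above.
NOSTATE, RUNNING, PAUSED, SHUTDOWN, CRASHED, SUSPENDED = 0x00, 0x01, 0x03, 0x04, 0x06, 0x07

def sync_power_state(db_power_state, vm_power_state, task_state):
    """Decide what the lifecycle-event sync should do for one instance."""
    if task_state is not None:
        # "During sync_power_state the instance has a pending task (spawning). Skip."
        return 'skip'
    if db_power_state != vm_power_state:
        return 'update-db'          # persist the hypervisor's view
    return 'in-sync'

# The case from this log: DB says 0 (NOSTATE), libvirt says 1 (RUNNING), task 'spawning'.
assert sync_power_state(NOSTATE, RUNNING, 'spawning') == 'skip'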
Jul 27 09:30:24 user nova-compute[70374]: DEBUG nova.compute.manager [req-e9cedcb7-0e6d-448a-bfb6-8dcc93256d29 req-59f6d9bf-3be9-4c18-860d-cff5b53f3d64 service nova] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Received event network-vif-plugged-98a9819c-4270-4b75-b851-635a3b19a7b4 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-e9cedcb7-0e6d-448a-bfb6-8dcc93256d29 req-59f6d9bf-3be9-4c18-860d-cff5b53f3d64 service nova] Acquiring lock "6a5593cd-3ab6-4859-b0fb-33a0ed702dc8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-e9cedcb7-0e6d-448a-bfb6-8dcc93256d29 req-59f6d9bf-3be9-4c18-860d-cff5b53f3d64 service nova] Lock "6a5593cd-3ab6-4859-b0fb-33a0ed702dc8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-e9cedcb7-0e6d-448a-bfb6-8dcc93256d29 req-59f6d9bf-3be9-4c18-860d-cff5b53f3d64 service nova] Lock "6a5593cd-3ab6-4859-b0fb-33a0ed702dc8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG nova.compute.manager [req-e9cedcb7-0e6d-448a-bfb6-8dcc93256d29 req-59f6d9bf-3be9-4c18-860d-cff5b53f3d64 service nova] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] No waiting events found dispatching network-vif-plugged-98a9819c-4270-4b75-b851-635a3b19a7b4 {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:30:24 user nova-compute[70374]: WARNING nova.compute.manager [req-e9cedcb7-0e6d-448a-bfb6-8dcc93256d29 req-59f6d9bf-3be9-4c18-860d-cff5b53f3d64 service nova] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Received unexpected event network-vif-plugged-98a9819c-4270-4b75-b851-635a3b19a7b4 for instance with vm_state building and task_state spawning. 
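The locks named "<instance-uuid>-events" above serialize access to a per-instance table of expected external events; when nothing has registered for the incoming network-vif-plugged event, the pop finds no waiter and the "Received unexpected event" warning follows. A simplified sketch of that lock-then-pop pattern is below (an illustration of the scheme visible in the log, not Nova's InstanceEvents code).

import threading
from oslo_concurrency import lockutils

_expected = {}  # {instance_uuid: {event_name: threading.Event}}

def prepare_for_event(instance_uuid, event_name):
    with lockutils.lock(f'{instance_uuid}-events'):
        _expected.setdefault(instance_uuid, {})[event_name] = threading.Event()

def pop_instance_event(instance_uuid, event_name):
    # Same "<uuid>-events" lock-name pattern as the log lines above.
    with lockutils.lock(f'{instance_uuid}-events'):
        waiter = _expected.get(instance_uuid, {}).pop(event_name, None)
    if waiter is None:
        # -> "No waiting events found dispatching ..." + "Received unexpected event ..."
        return None
    waiter.set()  # wake whoever is blocked waiting for this event
    return waiter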
Jul 27 09:30:24 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-2ec26956-f7ca-4dbf-816c-eb88f6250379 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Lock "04a990b9-ed32-41e9-b384-f0886e8d1b49" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 17.446s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG nova.network.neutron [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Updating instance_info_cache with network_info: [{"id": "b825d9b4-15c4-4b47-a3f7-af9838d09458", "address": "fa:16:3e:b2:af:a6", "network": {"id": "b51b0e60-de31-47a5-908f-36f76e9fa620", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-754390872-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "0592f0be670742a181e24823955f378b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb825d9b4-15", "ovs_interfaceid": "b825d9b4-15c4-4b47-a3f7-af9838d09458", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Releasing lock "refresh_cache-25214e8a-c626-46a7-b273-eb491c2fc91b" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG nova.compute.manager [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Instance network_info: |[{"id": "b825d9b4-15c4-4b47-a3f7-af9838d09458", "address": "fa:16:3e:b2:af:a6", "network": {"id": "b51b0e60-de31-47a5-908f-36f76e9fa620", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-754390872-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "0592f0be670742a181e24823955f378b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb825d9b4-15", "ovs_interfaceid": "b825d9b4-15c4-4b47-a3f7-af9838d09458", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| 
{{(pid=70374) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-48ad7955-1a31-46c4-9493-6247e752746e req-698b72d5-5340-42a8-b20f-d23d5ca9e1d7 service nova] Acquired lock "refresh_cache-25214e8a-c626-46a7-b273-eb491c2fc91b" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG nova.network.neutron [req-48ad7955-1a31-46c4-9493-6247e752746e req-698b72d5-5340-42a8-b20f-d23d5ca9e1d7 service nova] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Refreshing network info cache for port b825d9b4-15c4-4b47-a3f7-af9838d09458 {{(pid=70374) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Start _get_guest_xml network_info=[{"id": "b825d9b4-15c4-4b47-a3f7-af9838d09458", "address": "fa:16:3e:b2:af:a6", "network": {"id": "b51b0e60-de31-47a5-908f-36f76e9fa620", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-754390872-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "0592f0be670742a181e24823955f378b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb825d9b4-15", "ovs_interfaceid": "b825d9b4-15c4-4b47-a3f7-af9838d09458", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-07-27T09:26:27Z,direct_url=,disk_format='qcow2',id=35458adf-261a-4e0b-a4db-b243619b2394,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='37aff788807d4b2aafc10d355583f7c7',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-07-27T09:26:29Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encrypted': False, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'size': 0, 'encryption_options': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'image_id': '35458adf-261a-4e0b-a4db-b243619b2394'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70374) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7553}} Jul 27 09:30:24 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
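The network_info payloads being written to instance_info_cache above are plain JSON once lifted out of the log line, so the useful fields can be pulled out directly. A small helper sketch follows; network_info_json stands for the bracketed list copied from one of those lines.

import json

def summarize_network_info(network_info_json):
    """Yield one compact dict per VIF from a logged network_info list."""
    for vif in json.loads(network_info_json):
        fixed_ips = [ip['address']
                     for subnet in vif['network']['subnets']
                     for ip in subnet['ips']]
        yield {
            'port_id': vif['id'],                       # b825d9b4-15c4-...
            'mac': vif['address'],                      # fa:16:3e:b2:af:a6
            'bridge': vif['network']['bridge'],         # br-int
            'mtu': vif['network']['meta']['mtu'],       # 1442
            'fixed_ips': fixed_ips,                     # ['10.0.0.11']
            'devname': vif['devname'],                  # tapb825d9b4-15
            'active': vif['active'],                    # False until the port is reported up
        }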
Jul 27 09:30:24 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Jul 27 09:30:24 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Searching host: 'user' for CPU controller through CGroups V1... {{(pid=70374) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1650}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] CPU controller missing on host. {{(pid=70374) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1660}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Searching host: 'user' for CPU controller through CGroups V2... {{(pid=70374) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1669}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] CPU controller found on host. {{(pid=70374) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1676}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70374) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-07-27T09:26:16Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1',id=6,is_public=True,memory_mb=512,name='m1.tiny',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-07-27T09:26:27Z,direct_url=,disk_format='qcow2',id=35458adf-261a-4e0b-a4db-b243619b2394,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='37aff788807d4b2aafc10d355583f7c7',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-07-27T09:26:29Z,virtual_size=,visibility=), allow threads: True {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Flavor limits 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG 
nova.virt.hardware [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Image limits 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Flavor pref 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Image pref 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70374) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Got 1 possible topologies {{(pid=70374) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG nova.virt.libvirt.vif [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-07-27T09:30:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-1408375692',display_name='tempest-AttachVolumeTestJSON-server-1408375692',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(6),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-1408375692',id=6,image_ref='35458adf-261a-4e0b-a4db-b243619b2394',info_cache=InstanceInfoCache,instance_type_id=6,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH4a+uuDJdqeVGpMRf7PQlRSay+NAyjm/sX+4sRLANkXhl666R+2mNTJErxc9gBj9CSDiBIOY84BLAEFeJbf+9HEQDaVHrP8hjj5t2eksYyAEX3isjtLYaCVXeF4Zy8Ywg==',key_name='tempest-keypair-1085396211',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0592f0be670742a181e24823955f378b',ramdisk_id='',reservation_id='r-kqq9tpu3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='35458adf-261a-4e0b-a4db-b243619b2394',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-847846253',owner_user_name='tempest-AttachVolumeTestJSON-847846253-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-07-27T09:30:17Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9d4c5ec795ee4d3387ce8718d2ac67e0',uuid=25214e8a-c626-46a7-b273-eb491c2fc91b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b825d9b4-15c4-4b47-a3f7-af9838d09458", "address": "fa:16:3e:b2:af:a6", "network": {"id": "b51b0e60-de31-47a5-908f-36f76e9fa620", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-754390872-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "0592f0be670742a181e24823955f378b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb825d9b4-15", "ovs_interfaceid": "b825d9b4-15c4-4b47-a3f7-af9838d09458", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70374) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} 
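The nova.virt.hardware lines above show the topology selection for this flavor: with no flavor or image constraints, the limits default to 65536 for sockets/cores/threads, and the only layout for one vCPU is 1 socket x 1 core x 1 thread. A toy enumeration that reproduces that count follows (just the arithmetic the log reports, not Nova's implementation).

from collections import namedtuple

Topology = namedtuple('Topology', 'sockets cores threads')

def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
    """List every sockets*cores*threads combination that exactly equals vcpus."""
    return [Topology(s, c, t)
            for s in range(1, min(vcpus, max_sockets) + 1)
            for c in range(1, min(vcpus, max_cores) + 1)
            for t in range(1, min(vcpus, max_threads) + 1)
            if s * c * t == vcpus]

# "Build topologies for 1 vcpu(s) 1:1:1" -> "Got 1 possible topologies"
print(possible_topologies(1))   # [Topology(sockets=1, cores=1, threads=1)]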
Jul 27 09:30:24 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Converting VIF {"id": "b825d9b4-15c4-4b47-a3f7-af9838d09458", "address": "fa:16:3e:b2:af:a6", "network": {"id": "b51b0e60-de31-47a5-908f-36f76e9fa620", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-754390872-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "0592f0be670742a181e24823955f378b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb825d9b4-15", "ovs_interfaceid": "b825d9b4-15c4-4b47-a3f7-af9838d09458", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b2:af:a6,bridge_name='br-int',has_traffic_filtering=True,id=b825d9b4-15c4-4b47-a3f7-af9838d09458,network=Network(b51b0e60-de31-47a5-908f-36f76e9fa620),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb825d9b4-15') {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG nova.objects.instance [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Lazy-loading 'pci_devices' on Instance uuid 25214e8a-c626-46a7-b273-eb491c2fc91b {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] End _get_guest_xml xml= [guest domain XML elided: the element markup was stripped in this capture, leaving only text values, each behind a repeated "Jul 27 09:30:24 user nova-compute[70374]:" prefix; the surviving values were: 25214e8a-c626-46a7-b273-eb491c2fc91b, instance-00000006, 524288, 1, tempest-AttachVolumeTestJSON-server-1408375692, 2023-07-27 09:30:24, 512, 1, 0, 0, 1, tempest-AttachVolumeTestJSON-847846253-project-member, tempest-AttachVolumeTestJSON-847846253, OpenStack Foundation, OpenStack Nova, 0.0.0, 25214e8a-c626-46a7-b273-eb491c2fc91b (twice), Virtual Machine, hvm, Nehalem, /dev/urandom] {{(pid=70374) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7559}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG nova.virt.libvirt.vif [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] vif_type=ovs
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-07-27T09:30:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-1408375692',display_name='tempest-AttachVolumeTestJSON-server-1408375692',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(6),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-1408375692',id=6,image_ref='35458adf-261a-4e0b-a4db-b243619b2394',info_cache=InstanceInfoCache,instance_type_id=6,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH4a+uuDJdqeVGpMRf7PQlRSay+NAyjm/sX+4sRLANkXhl666R+2mNTJErxc9gBj9CSDiBIOY84BLAEFeJbf+9HEQDaVHrP8hjj5t2eksYyAEX3isjtLYaCVXeF4Zy8Ywg==',key_name='tempest-keypair-1085396211',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0592f0be670742a181e24823955f378b',ramdisk_id='',reservation_id='r-kqq9tpu3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='35458adf-261a-4e0b-a4db-b243619b2394',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-847846253',owner_user_name='tempest-AttachVolumeTestJSON-847846253-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-07-27T09:30:17Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9d4c5ec795ee4d3387ce8718d2ac67e0',uuid=25214e8a-c626-46a7-b273-eb491c2fc91b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b825d9b4-15c4-4b47-a3f7-af9838d09458", "address": "fa:16:3e:b2:af:a6", "network": {"id": "b51b0e60-de31-47a5-908f-36f76e9fa620", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-754390872-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "0592f0be670742a181e24823955f378b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb825d9b4-15", "ovs_interfaceid": "b825d9b4-15c4-4b47-a3f7-af9838d09458", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Jul 27 
09:30:24 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Converting VIF {"id": "b825d9b4-15c4-4b47-a3f7-af9838d09458", "address": "fa:16:3e:b2:af:a6", "network": {"id": "b51b0e60-de31-47a5-908f-36f76e9fa620", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-754390872-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "0592f0be670742a181e24823955f378b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb825d9b4-15", "ovs_interfaceid": "b825d9b4-15c4-4b47-a3f7-af9838d09458", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b2:af:a6,bridge_name='br-int',has_traffic_filtering=True,id=b825d9b4-15c4-4b47-a3f7-af9838d09458,network=Network(b51b0e60-de31-47a5-908f-36f76e9fa620),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb825d9b4-15') {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG os_vif [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b2:af:a6,bridge_name='br-int',has_traffic_filtering=True,id=b825d9b4-15c4-4b47-a3f7-af9838d09458,network=Network(b51b0e60-de31-47a5-908f-36f76e9fa620),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb825d9b4-15') {{(pid=70374) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): 
AddPortCommand(_result=None, bridge=br-int, port=tapb825d9b4-15, may_exist=True, interface_attrs={}) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb825d9b4-15, col_values=(('external_ids', {'iface-id': 'b825d9b4-15c4-4b47-a3f7-af9838d09458', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b2:af:a6', 'vm-uuid': '25214e8a-c626-46a7-b273-eb491c2fc91b'}),), if_exists=True) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:24 user nova-compute[70374]: INFO os_vif [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b2:af:a6,bridge_name='br-int',has_traffic_filtering=True,id=b825d9b4-15c4-4b47-a3f7-af9838d09458,network=Network(b51b0e60-de31-47a5-908f-36f76e9fa620),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb825d9b4-15') Jul 27 09:30:24 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=70374) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Jul 27 09:30:24 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] No VIF found with MAC fa:16:3e:b2:af:a6, not building metadata {{(pid=70374) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Jul 27 09:30:25 user nova-compute[70374]: DEBUG nova.compute.manager [req-fe2e302b-3b25-4fb7-972e-10364157a376 req-87f6761a-ddfc-49da-81f6-5591c7c5a56b service nova] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Received event network-vif-plugged-fd47b104-c8ff-4b76-8f3a-53725f9f318c {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:30:25 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-fe2e302b-3b25-4fb7-972e-10364157a376 req-87f6761a-ddfc-49da-81f6-5591c7c5a56b service nova] Acquiring lock "fb5ccac9-1e45-4726-b681-cf34cf3fa521-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:25 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-fe2e302b-3b25-4fb7-972e-10364157a376 req-87f6761a-ddfc-49da-81f6-5591c7c5a56b service nova] Lock "fb5ccac9-1e45-4726-b681-cf34cf3fa521-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:25 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-fe2e302b-3b25-4fb7-972e-10364157a376 req-87f6761a-ddfc-49da-81f6-5591c7c5a56b service nova] Lock "fb5ccac9-1e45-4726-b681-cf34cf3fa521-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:25 user nova-compute[70374]: DEBUG nova.compute.manager [req-fe2e302b-3b25-4fb7-972e-10364157a376 req-87f6761a-ddfc-49da-81f6-5591c7c5a56b service nova] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] No waiting events found dispatching network-vif-plugged-fd47b104-c8ff-4b76-8f3a-53725f9f318c {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:30:25 user nova-compute[70374]: WARNING nova.compute.manager [req-fe2e302b-3b25-4fb7-972e-10364157a376 req-87f6761a-ddfc-49da-81f6-5591c7c5a56b service nova] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Received unexpected event network-vif-plugged-fd47b104-c8ff-4b76-8f3a-53725f9f318c for instance with vm_state building and task_state spawning. 
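The Converting VIF / Plugging vif / Successfully plugged vif sequence above is the os-vif library being driven with an ovs-type port. Below is a minimal sketch of calling it directly with this port's values; the field names follow the VIFOpenVSwitch repr in the log, and the whole block is an illustration rather than Nova's call site.

import os_vif
from os_vif.objects import instance_info, network, vif

os_vif.initialize()   # loads the linux_bridge/noop/ovs plugins noted at service start

port = vif.VIFOpenVSwitch(
    id='b825d9b4-15c4-4b47-a3f7-af9838d09458',
    address='fa:16:3e:b2:af:a6',
    network=network.Network(id='b51b0e60-de31-47a5-908f-36f76e9fa620', bridge='br-int'),
    vif_name='tapb825d9b4-15',
    bridge_name='br-int',
    has_traffic_filtering=True,
    preserve_on_delete=False,
    port_profile=vif.VIFPortProfileOpenVSwitch(
        interface_id='b825d9b4-15c4-4b47-a3f7-af9838d09458'),
)
inst = instance_info.InstanceInfo(uuid='25214e8a-c626-46a7-b273-eb491c2fc91b',
                                  name='instance-00000006')

os_vif.plug(port, inst)   # produces the AddPortCommand/DbSetCommand transaction seen above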
Jul 27 09:30:25 user nova-compute[70374]: DEBUG nova.compute.manager [req-fe2e302b-3b25-4fb7-972e-10364157a376 req-87f6761a-ddfc-49da-81f6-5591c7c5a56b service nova] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Received event network-vif-plugged-fd47b104-c8ff-4b76-8f3a-53725f9f318c {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:30:25 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-fe2e302b-3b25-4fb7-972e-10364157a376 req-87f6761a-ddfc-49da-81f6-5591c7c5a56b service nova] Acquiring lock "fb5ccac9-1e45-4726-b681-cf34cf3fa521-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:25 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-fe2e302b-3b25-4fb7-972e-10364157a376 req-87f6761a-ddfc-49da-81f6-5591c7c5a56b service nova] Lock "fb5ccac9-1e45-4726-b681-cf34cf3fa521-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:25 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-fe2e302b-3b25-4fb7-972e-10364157a376 req-87f6761a-ddfc-49da-81f6-5591c7c5a56b service nova] Lock "fb5ccac9-1e45-4726-b681-cf34cf3fa521-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:25 user nova-compute[70374]: DEBUG nova.compute.manager [req-fe2e302b-3b25-4fb7-972e-10364157a376 req-87f6761a-ddfc-49da-81f6-5591c7c5a56b service nova] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] No waiting events found dispatching network-vif-plugged-fd47b104-c8ff-4b76-8f3a-53725f9f318c {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:30:25 user nova-compute[70374]: WARNING nova.compute.manager [req-fe2e302b-3b25-4fb7-972e-10364157a376 req-87f6761a-ddfc-49da-81f6-5591c7c5a56b service nova] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Received unexpected event network-vif-plugged-fd47b104-c8ff-4b76-8f3a-53725f9f318c for instance with vm_state building and task_state spawning. 
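The two "Received unexpected event network-vif-plugged-fd47b104-..." warnings above are harmless during a normal spawn: the plug-time waiter has either not been registered yet or has already been satisfied, so the pop finds nothing. The general prepare-before-trigger ordering the warning hints at looks roughly like this (a sketch with plain threading primitives, not the Nova implementation).

import threading

class EventWaiter:
    """Register interest in an event before triggering the action that produces it."""

    def __init__(self):
        self._lock = threading.Lock()
        self._waiters = {}

    def prepare(self, name):
        with self._lock:
            self._waiters[name] = threading.Event()

    def notify(self, name):
        with self._lock:
            ev = self._waiters.pop(name, None)
        if ev is None:
            return False        # no registered waiter -> "unexpected event"
        ev.set()
        return True

    def wait(self, name, timeout=300):
        with self._lock:
            ev = self._waiters.get(name)
        # Already popped means the event was delivered before we got here.
        return True if ev is None else ev.wait(timeout)

w = EventWaiter()
w.prepare('network-vif-plugged-fd47b104')    # register first ...
# ... then plug the VIF; when the notification arrives:
assert w.notify('network-vif-plugged-fd47b104') is True
assert w.notify('network-vif-plugged-fd47b104') is False   # a second copy is "unexpected"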
Jul 27 09:30:25 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:25 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:25 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:25 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:26 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:26 user nova-compute[70374]: DEBUG nova.network.neutron [req-48ad7955-1a31-46c4-9493-6247e752746e req-698b72d5-5340-42a8-b20f-d23d5ca9e1d7 service nova] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Updated VIF entry in instance network info cache for port b825d9b4-15c4-4b47-a3f7-af9838d09458. {{(pid=70374) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Jul 27 09:30:26 user nova-compute[70374]: DEBUG nova.network.neutron [req-48ad7955-1a31-46c4-9493-6247e752746e req-698b72d5-5340-42a8-b20f-d23d5ca9e1d7 service nova] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Updating instance_info_cache with network_info: [{"id": "b825d9b4-15c4-4b47-a3f7-af9838d09458", "address": "fa:16:3e:b2:af:a6", "network": {"id": "b51b0e60-de31-47a5-908f-36f76e9fa620", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-754390872-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "0592f0be670742a181e24823955f378b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb825d9b4-15", "ovs_interfaceid": "b825d9b4-15c4-4b47-a3f7-af9838d09458", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:30:26 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-48ad7955-1a31-46c4-9493-6247e752746e req-698b72d5-5340-42a8-b20f-d23d5ca9e1d7 service nova] Releasing lock "refresh_cache-25214e8a-c626-46a7-b273-eb491c2fc91b" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Jul 27 09:30:26 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:26 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:26 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:26 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:26 user nova-compute[70374]: DEBUG nova.virt.driver [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Emitting event Resumed> {{(pid=70374) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Jul 27 09:30:26 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] VM Resumed (Lifecycle Event) Jul 27 09:30:26 user nova-compute[70374]: DEBUG nova.compute.manager [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Instance event wait completed in 0 seconds for {{(pid=70374) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Jul 27 09:30:26 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Guest created on hypervisor {{(pid=70374) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4411}} Jul 27 09:30:26 user nova-compute[70374]: INFO nova.virt.libvirt.driver [-] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Instance spawned successfully. Jul 27 09:30:26 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:894}} Jul 27 09:30:26 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:30:26 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70374) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Jul 27 09:30:26 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Found default for hw_cdrom_bus of ide {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:26 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Found default for hw_disk_bus of virtio {{(pid=70374) 
_register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:26 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Found default for hw_input_bus of None {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:26 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Found default for hw_pointer_model of None {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:26 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Found default for hw_video_model of virtio {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:26 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Found default for hw_vif_model of virtio {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:26 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] During sync_power_state the instance has a pending task (spawning). Skip. Jul 27 09:30:26 user nova-compute[70374]: DEBUG nova.virt.driver [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Emitting event Started> {{(pid=70374) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Jul 27 09:30:26 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] VM Started (Lifecycle Event) Jul 27 09:30:26 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:30:26 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70374) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Jul 27 09:30:26 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] During sync_power_state the instance has a pending task (spawning). Skip. 
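
The repeated "Found default for hw_* of ..." entries show the libvirt driver filling in image properties the image itself left undefined (hw_cdrom_bus=ide, hw_disk_bus=virtio, and so on). A rough sketch of that fill-only-missing-keys idea; the defaults table simply copies the values printed in the log, and the function name is an assumption rather than Nova's code:

    # Defaults taken from the values reported in the log above; purely illustrative.
    ASSUMED_DEFAULTS = {
        "hw_cdrom_bus": "ide",
        "hw_disk_bus": "virtio",
        "hw_input_bus": None,
        "hw_pointer_model": None,
        "hw_video_model": "virtio",
        "hw_vif_model": "virtio",
    }


    def register_undefined_image_properties(image_props: dict) -> dict:
        """Return image properties with defaults added only where undefined."""
        registered = dict(image_props)
        for prop, default in ASSUMED_DEFAULTS.items():
            if prop not in registered:
                print(f"Found default for {prop} of {default}")
                registered[prop] = default
        return registered


    if __name__ == "__main__":
        # An image that only pins the disk bus keeps that value; the rest get defaults.
        print(register_undefined_image_properties({"hw_disk_bus": "scsi"}))
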
Jul 27 09:30:26 user nova-compute[70374]: INFO nova.compute.manager [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Took 13.93 seconds to spawn the instance on the hypervisor. Jul 27 09:30:26 user nova-compute[70374]: DEBUG nova.compute.manager [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:30:27 user nova-compute[70374]: INFO nova.compute.manager [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Took 14.76 seconds to build instance. Jul 27 09:30:27 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-fec1f2d0-8fc5-46cd-8d73-ac60877a0c48 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Lock "6a5593cd-3ab6-4859-b0fb-33a0ed702dc8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 14.906s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:27 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:27 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:27 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:27 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:27 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:27 user nova-compute[70374]: DEBUG nova.compute.manager [req-d0bc7ef6-0c05-4897-a7ee-975c06eda5d7 req-6182069e-042d-47de-a45e-24dfdb463306 service nova] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Received event network-vif-plugged-ce2809b5-cc30-4a29-9770-7df186b16406 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:30:27 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-d0bc7ef6-0c05-4897-a7ee-975c06eda5d7 req-6182069e-042d-47de-a45e-24dfdb463306 service nova] Acquiring lock "6ae93f34-ce7e-4ae4-a5ba-36508361bd54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:27 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-d0bc7ef6-0c05-4897-a7ee-975c06eda5d7 req-6182069e-042d-47de-a45e-24dfdb463306 service nova] Lock "6ae93f34-ce7e-4ae4-a5ba-36508361bd54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70374) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:27 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-d0bc7ef6-0c05-4897-a7ee-975c06eda5d7 req-6182069e-042d-47de-a45e-24dfdb463306 service nova] Lock "6ae93f34-ce7e-4ae4-a5ba-36508361bd54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:27 user nova-compute[70374]: DEBUG nova.compute.manager [req-d0bc7ef6-0c05-4897-a7ee-975c06eda5d7 req-6182069e-042d-47de-a45e-24dfdb463306 service nova] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] No waiting events found dispatching network-vif-plugged-ce2809b5-cc30-4a29-9770-7df186b16406 {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:30:27 user nova-compute[70374]: WARNING nova.compute.manager [req-d0bc7ef6-0c05-4897-a7ee-975c06eda5d7 req-6182069e-042d-47de-a45e-24dfdb463306 service nova] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Received unexpected event network-vif-plugged-ce2809b5-cc30-4a29-9770-7df186b16406 for instance with vm_state building and task_state spawning. Jul 27 09:30:27 user nova-compute[70374]: DEBUG nova.compute.manager [req-d0bc7ef6-0c05-4897-a7ee-975c06eda5d7 req-6182069e-042d-47de-a45e-24dfdb463306 service nova] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Received event network-vif-plugged-ce2809b5-cc30-4a29-9770-7df186b16406 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:30:27 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-d0bc7ef6-0c05-4897-a7ee-975c06eda5d7 req-6182069e-042d-47de-a45e-24dfdb463306 service nova] Acquiring lock "6ae93f34-ce7e-4ae4-a5ba-36508361bd54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:27 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-d0bc7ef6-0c05-4897-a7ee-975c06eda5d7 req-6182069e-042d-47de-a45e-24dfdb463306 service nova] Lock "6ae93f34-ce7e-4ae4-a5ba-36508361bd54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:27 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-d0bc7ef6-0c05-4897-a7ee-975c06eda5d7 req-6182069e-042d-47de-a45e-24dfdb463306 service nova] Lock "6ae93f34-ce7e-4ae4-a5ba-36508361bd54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:27 user nova-compute[70374]: DEBUG nova.compute.manager [req-d0bc7ef6-0c05-4897-a7ee-975c06eda5d7 req-6182069e-042d-47de-a45e-24dfdb463306 service nova] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] No waiting events found dispatching network-vif-plugged-ce2809b5-cc30-4a29-9770-7df186b16406 {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:30:27 user nova-compute[70374]: WARNING nova.compute.manager [req-d0bc7ef6-0c05-4897-a7ee-975c06eda5d7 req-6182069e-042d-47de-a45e-24dfdb463306 service nova] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Received unexpected event network-vif-plugged-ce2809b5-cc30-4a29-9770-7df186b16406 for instance with 
vm_state building and task_state spawning. Jul 27 09:30:27 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:27 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:27 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:27 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:27 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:28 user nova-compute[70374]: DEBUG nova.compute.manager [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Instance event wait completed in 0 seconds for {{(pid=70374) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Jul 27 09:30:28 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Guest created on hypervisor {{(pid=70374) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4411}} Jul 27 09:30:28 user nova-compute[70374]: DEBUG nova.virt.driver [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Emitting event Resumed> {{(pid=70374) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Jul 27 09:30:28 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] VM Resumed (Lifecycle Event) Jul 27 09:30:28 user nova-compute[70374]: INFO nova.virt.libvirt.driver [-] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Instance spawned successfully. 
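
The "Instance event wait completed in 0 seconds" entries are the consuming side of the event machinery sketched earlier: the build path registers a waiter, plugs the VIF, then blocks until the event fires, returning immediately when it has already arrived. A small illustrative sketch of that waiting side, again not Nova's implementation:

    import threading
    import time


    def wait_for_instance_event(waiter: threading.Event, timeout: float = 300.0):
        """Block until the prepared event fires, reporting how long it took."""
        start = time.monotonic()
        if not waiter.wait(timeout):
            raise TimeoutError("timed out waiting for instance event")
        print(f"Instance event wait completed in {time.monotonic() - start:.0f} seconds")


    if __name__ == "__main__":
        waiter = threading.Event()
        waiter.set()                      # the event already arrived, as in the log
        wait_for_instance_event(waiter)   # prints "... completed in 0 seconds"
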
Jul 27 09:30:28 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:894}} Jul 27 09:30:28 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:30:28 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Found default for hw_cdrom_bus of ide {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:28 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Found default for hw_disk_bus of virtio {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:28 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Found default for hw_input_bus of None {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:28 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Found default for hw_pointer_model of None {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:28 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Found default for hw_video_model of virtio {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:28 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Found default for hw_vif_model of virtio {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:28 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Synchronizing instance power state 
after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70374) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Jul 27 09:30:28 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] During sync_power_state the instance has a pending task (spawning). Skip. Jul 27 09:30:28 user nova-compute[70374]: DEBUG nova.virt.driver [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Emitting event Started> {{(pid=70374) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Jul 27 09:30:28 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] VM Started (Lifecycle Event) Jul 27 09:30:28 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:30:28 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70374) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Jul 27 09:30:28 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] During sync_power_state the instance has a pending task (spawning). Skip. Jul 27 09:30:28 user nova-compute[70374]: INFO nova.compute.manager [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Took 17.81 seconds to spawn the instance on the hypervisor. 
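
Each "Synchronizing instance power state ..." / "During sync_power_state the instance has a pending task (spawning). Skip." pair records the manager comparing the power state stored in the database (0, no state yet) with what the hypervisor reports (1, running) and deliberately doing nothing because a task still owns the instance. A simplified sketch of that decision; the numeric constants mirror the 0/1 values in the log, and the out-of-sync branch is only an assumption about the general idea, not Nova's exact logic:

    # Assumed numeric power states, matching the 0/1 values in the log.
    NOSTATE, RUNNING = 0, 1


    def sync_power_state(db_power_state: int, vm_power_state: int,
                         vm_state: str, task_state: str | None) -> str:
        """Decide what to do when DB and hypervisor power states disagree."""
        if task_state is not None:
            # An operation (here: spawning) owns the instance; don't fight it.
            return f"pending task ({task_state}), skip"
        if db_power_state == vm_power_state:
            return "in sync, nothing to do"
        # Out of sync with no task in flight: record the hypervisor's view.
        return f"update DB power_state {db_power_state} -> {vm_power_state}"


    if __name__ == "__main__":
        print(sync_power_state(NOSTATE, RUNNING, "building", "spawning"))
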
Jul 27 09:30:28 user nova-compute[70374]: DEBUG nova.compute.manager [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:30:28 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:28 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:28 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:28 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:28 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:28 user nova-compute[70374]: INFO nova.compute.manager [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Took 19.06 seconds to build instance. Jul 27 09:30:29 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-cdfda19c-0d10-40e6-9a38-7ff6ede4bafb tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Lock "fb5ccac9-1e45-4726-b681-cf34cf3fa521" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 19.240s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:29 user nova-compute[70374]: DEBUG nova.compute.manager [req-d3dd70fc-ddb8-4a8b-ba5a-960a5b022eba req-122bd9a1-655b-4670-b76f-4b58a916c592 service nova] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Received event network-vif-plugged-b825d9b4-15c4-4b47-a3f7-af9838d09458 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:30:29 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-d3dd70fc-ddb8-4a8b-ba5a-960a5b022eba req-122bd9a1-655b-4670-b76f-4b58a916c592 service nova] Acquiring lock "25214e8a-c626-46a7-b273-eb491c2fc91b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:29 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-d3dd70fc-ddb8-4a8b-ba5a-960a5b022eba req-122bd9a1-655b-4670-b76f-4b58a916c592 service nova] Lock "25214e8a-c626-46a7-b273-eb491c2fc91b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:29 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-d3dd70fc-ddb8-4a8b-ba5a-960a5b022eba req-122bd9a1-655b-4670-b76f-4b58a916c592 service nova] Lock "25214e8a-c626-46a7-b273-eb491c2fc91b-events" "released" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:29 user nova-compute[70374]: DEBUG nova.compute.manager [req-d3dd70fc-ddb8-4a8b-ba5a-960a5b022eba req-122bd9a1-655b-4670-b76f-4b58a916c592 service nova] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] No waiting events found dispatching network-vif-plugged-b825d9b4-15c4-4b47-a3f7-af9838d09458 {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:30:29 user nova-compute[70374]: WARNING nova.compute.manager [req-d3dd70fc-ddb8-4a8b-ba5a-960a5b022eba req-122bd9a1-655b-4670-b76f-4b58a916c592 service nova] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Received unexpected event network-vif-plugged-b825d9b4-15c4-4b47-a3f7-af9838d09458 for instance with vm_state building and task_state spawning. Jul 27 09:30:29 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:29 user nova-compute[70374]: DEBUG nova.compute.manager [req-1ebabe94-93e6-4b27-b197-7242c91fa071 req-b560cb76-90fe-41dd-bc3a-e5b961f27634 service nova] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Received event network-vif-plugged-e6dc311e-c8d0-4b0c-bf42-96919f9d6f02 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:30:29 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-1ebabe94-93e6-4b27-b197-7242c91fa071 req-b560cb76-90fe-41dd-bc3a-e5b961f27634 service nova] Acquiring lock "8640c525-e6ba-4bf8-9fe0-2c08155dd1cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:29 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-1ebabe94-93e6-4b27-b197-7242c91fa071 req-b560cb76-90fe-41dd-bc3a-e5b961f27634 service nova] Lock "8640c525-e6ba-4bf8-9fe0-2c08155dd1cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:29 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-1ebabe94-93e6-4b27-b197-7242c91fa071 req-b560cb76-90fe-41dd-bc3a-e5b961f27634 service nova] Lock "8640c525-e6ba-4bf8-9fe0-2c08155dd1cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:29 user nova-compute[70374]: DEBUG nova.compute.manager [req-1ebabe94-93e6-4b27-b197-7242c91fa071 req-b560cb76-90fe-41dd-bc3a-e5b961f27634 service nova] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] No waiting events found dispatching network-vif-plugged-e6dc311e-c8d0-4b0c-bf42-96919f9d6f02 {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:30:29 user nova-compute[70374]: WARNING nova.compute.manager [req-1ebabe94-93e6-4b27-b197-7242c91fa071 req-b560cb76-90fe-41dd-bc3a-e5b961f27634 service nova] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Received unexpected event network-vif-plugged-e6dc311e-c8d0-4b0c-bf42-96919f9d6f02 for instance with vm_state building and task_state spawning. 
Jul 27 09:30:29 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:30 user nova-compute[70374]: DEBUG nova.virt.driver [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Emitting event Resumed> {{(pid=70374) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Jul 27 09:30:30 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] VM Resumed (Lifecycle Event) Jul 27 09:30:30 user nova-compute[70374]: DEBUG nova.compute.manager [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Instance event wait completed in 0 seconds for {{(pid=70374) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Jul 27 09:30:30 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Guest created on hypervisor {{(pid=70374) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4411}} Jul 27 09:30:30 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:30:30 user nova-compute[70374]: INFO nova.virt.libvirt.driver [-] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Instance spawned successfully. Jul 27 09:30:30 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:894}} Jul 27 09:30:30 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70374) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Jul 27 09:30:30 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] During sync_power_state the instance has a pending task (spawning). Skip. 
Jul 27 09:30:30 user nova-compute[70374]: DEBUG nova.virt.driver [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Emitting event Started> {{(pid=70374) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Jul 27 09:30:30 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] VM Started (Lifecycle Event) Jul 27 09:30:30 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Found default for hw_cdrom_bus of ide {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:30 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Found default for hw_disk_bus of virtio {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:30 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Found default for hw_input_bus of None {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:30 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Found default for hw_pointer_model of None {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:30 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Found default for hw_video_model of virtio {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:30 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Found default for hw_vif_model of virtio {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:30 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:30:30 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM 
power_state: 1 {{(pid=70374) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Jul 27 09:30:30 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] During sync_power_state the instance has a pending task (spawning). Skip. Jul 27 09:30:30 user nova-compute[70374]: INFO nova.compute.manager [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Took 16.88 seconds to spawn the instance on the hypervisor. Jul 27 09:30:30 user nova-compute[70374]: DEBUG nova.compute.manager [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:30:30 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:30 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:30 user nova-compute[70374]: INFO nova.compute.manager [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Took 18.14 seconds to build instance. Jul 27 09:30:30 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-5a50d6dc-fbe1-45e4-a8f9-8c00e88048a3 tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Lock "6ae93f34-ce7e-4ae4-a5ba-36508361bd54" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 18.295s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:31 user nova-compute[70374]: DEBUG nova.compute.manager [req-46cc828c-a689-4ba5-9c73-56031dda7f05 req-3b81f859-cbe6-4b03-899e-773dcce7aae5 service nova] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Received event network-vif-plugged-b825d9b4-15c4-4b47-a3f7-af9838d09458 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:30:31 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-46cc828c-a689-4ba5-9c73-56031dda7f05 req-3b81f859-cbe6-4b03-899e-773dcce7aae5 service nova] Acquiring lock "25214e8a-c626-46a7-b273-eb491c2fc91b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:31 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-46cc828c-a689-4ba5-9c73-56031dda7f05 req-3b81f859-cbe6-4b03-899e-773dcce7aae5 service nova] Lock "25214e8a-c626-46a7-b273-eb491c2fc91b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.002s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:31 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-46cc828c-a689-4ba5-9c73-56031dda7f05 
req-3b81f859-cbe6-4b03-899e-773dcce7aae5 service nova] Lock "25214e8a-c626-46a7-b273-eb491c2fc91b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:31 user nova-compute[70374]: DEBUG nova.compute.manager [req-46cc828c-a689-4ba5-9c73-56031dda7f05 req-3b81f859-cbe6-4b03-899e-773dcce7aae5 service nova] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] No waiting events found dispatching network-vif-plugged-b825d9b4-15c4-4b47-a3f7-af9838d09458 {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:30:31 user nova-compute[70374]: WARNING nova.compute.manager [req-46cc828c-a689-4ba5-9c73-56031dda7f05 req-3b81f859-cbe6-4b03-899e-773dcce7aae5 service nova] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Received unexpected event network-vif-plugged-b825d9b4-15c4-4b47-a3f7-af9838d09458 for instance with vm_state building and task_state spawning. Jul 27 09:30:31 user nova-compute[70374]: DEBUG nova.virt.driver [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Emitting event Resumed> {{(pid=70374) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Jul 27 09:30:31 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] VM Resumed (Lifecycle Event) Jul 27 09:30:31 user nova-compute[70374]: DEBUG nova.compute.manager [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Instance event wait completed in 0 seconds for {{(pid=70374) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Jul 27 09:30:31 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Guest created on hypervisor {{(pid=70374) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4411}} Jul 27 09:30:31 user nova-compute[70374]: INFO nova.virt.libvirt.driver [-] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Instance spawned successfully. 
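
The "Emitting event ... Resumed>" / "VM Resumed (Lifecycle Event)" pairs seen throughout this stretch show the virt driver pushing hypervisor lifecycle transitions to a callback the compute manager registered at startup, which then runs the power-state sync above. A bare-bones sketch of such an emitter/listener pair, with invented class names:

    from dataclasses import dataclass


    @dataclass
    class LifecycleEvent:
        instance_uuid: str
        transition: str   # e.g. "Resumed", "Started"


    class ToyDriver:
        """Stand-in for a virt driver that pushes lifecycle events to a listener."""

        def __init__(self):
            self._callback = None   # set by register_event_listener()

        def register_event_listener(self, callback):
            self._callback = callback

        def emit_event(self, event):
            print(f"Emitting event {event.transition} for {event.instance_uuid}")
            if self._callback is not None:
                self._callback(event)


    def handle_lifecycle_event(event):
        print(f"[instance: {event.instance_uuid}] VM {event.transition} (Lifecycle Event)")


    if __name__ == "__main__":
        driver = ToyDriver()
        driver.register_event_listener(handle_lifecycle_event)
        driver.emit_event(LifecycleEvent("6a5593cd-3ab6-4859-b0fb-33a0ed702dc8", "Resumed"))
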
Jul 27 09:30:31 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:894}} Jul 27 09:30:31 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:30:31 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70374) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Jul 27 09:30:31 user nova-compute[70374]: DEBUG nova.compute.manager [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Instance event wait completed in 0 seconds for {{(pid=70374) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Jul 27 09:30:31 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Guest created on hypervisor {{(pid=70374) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4411}} Jul 27 09:30:31 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Found default for hw_cdrom_bus of ide {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:31 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Found default for hw_disk_bus of virtio {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:31 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Found default for hw_input_bus of None {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:31 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Found default for hw_pointer_model of None {{(pid=70374) 
_register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:31 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Found default for hw_video_model of virtio {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:31 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Found default for hw_vif_model of virtio {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:31 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] During sync_power_state the instance has a pending task (spawning). Skip. Jul 27 09:30:31 user nova-compute[70374]: DEBUG nova.virt.driver [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Emitting event Started> {{(pid=70374) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Jul 27 09:30:31 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] VM Started (Lifecycle Event) Jul 27 09:30:31 user nova-compute[70374]: INFO nova.virt.libvirt.driver [-] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Instance spawned successfully. Jul 27 09:30:31 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:894}} Jul 27 09:30:31 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:31 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:30:31 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:31 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Found default for hw_cdrom_bus of ide {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:31 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 
8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Found default for hw_disk_bus of virtio {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:31 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Found default for hw_input_bus of None {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:31 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Found default for hw_pointer_model of None {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:31 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Found default for hw_video_model of virtio {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:31 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Found default for hw_vif_model of virtio {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:31 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:31 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70374) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Jul 27 09:30:31 user nova-compute[70374]: DEBUG nova.compute.manager [req-f29206a5-a4fd-434d-a971-fec17fbd62d9 req-d4627134-b222-4574-ad70-3e3d20080aee service nova] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Received event network-vif-plugged-e6dc311e-c8d0-4b0c-bf42-96919f9d6f02 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:30:31 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-f29206a5-a4fd-434d-a971-fec17fbd62d9 req-d4627134-b222-4574-ad70-3e3d20080aee service nova] Acquiring lock "8640c525-e6ba-4bf8-9fe0-2c08155dd1cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:31 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-f29206a5-a4fd-434d-a971-fec17fbd62d9 req-d4627134-b222-4574-ad70-3e3d20080aee service nova] Lock "8640c525-e6ba-4bf8-9fe0-2c08155dd1cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s 
{{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:31 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-f29206a5-a4fd-434d-a971-fec17fbd62d9 req-d4627134-b222-4574-ad70-3e3d20080aee service nova] Lock "8640c525-e6ba-4bf8-9fe0-2c08155dd1cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:31 user nova-compute[70374]: DEBUG nova.compute.manager [req-f29206a5-a4fd-434d-a971-fec17fbd62d9 req-d4627134-b222-4574-ad70-3e3d20080aee service nova] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] No waiting events found dispatching network-vif-plugged-e6dc311e-c8d0-4b0c-bf42-96919f9d6f02 {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:30:31 user nova-compute[70374]: WARNING nova.compute.manager [req-f29206a5-a4fd-434d-a971-fec17fbd62d9 req-d4627134-b222-4574-ad70-3e3d20080aee service nova] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Received unexpected event network-vif-plugged-e6dc311e-c8d0-4b0c-bf42-96919f9d6f02 for instance with vm_state building and task_state spawning. Jul 27 09:30:31 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] During sync_power_state the instance has a pending task (spawning). Skip. Jul 27 09:30:31 user nova-compute[70374]: DEBUG nova.virt.driver [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Emitting event Resumed> {{(pid=70374) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Jul 27 09:30:31 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] VM Resumed (Lifecycle Event) Jul 27 09:30:31 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:30:31 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70374) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Jul 27 09:30:31 user nova-compute[70374]: INFO nova.compute.manager [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Took 14.52 seconds to spawn the instance on the hypervisor. Jul 27 09:30:31 user nova-compute[70374]: DEBUG nova.compute.manager [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:30:31 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] During sync_power_state the instance has a pending task (spawning). Skip. 
Jul 27 09:30:31 user nova-compute[70374]: DEBUG nova.virt.driver [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Emitting event Started> {{(pid=70374) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Jul 27 09:30:31 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] VM Started (Lifecycle Event) Jul 27 09:30:32 user nova-compute[70374]: INFO nova.compute.manager [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Took 15.97 seconds to spawn the instance on the hypervisor. Jul 27 09:30:32 user nova-compute[70374]: DEBUG nova.compute.manager [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:30:32 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:30:32 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70374) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Jul 27 09:30:32 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] During sync_power_state the instance has a pending task (spawning). Skip. Jul 27 09:30:32 user nova-compute[70374]: INFO nova.compute.manager [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Took 16.90 seconds to build instance. Jul 27 09:30:32 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-7a680851-07bd-4ef5-bc9b-c2b59b67dca7 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Lock "8640c525-e6ba-4bf8-9fe0-2c08155dd1cb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 17.094s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:32 user nova-compute[70374]: INFO nova.compute.manager [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Took 15.69 seconds to build instance. 
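
Throughout the log, oslo.concurrency reports how long each named lock was waited for and held ("acquired ... waited 0.005s", '"released" ... held 17.094s'). A rough sketch of a context manager that produces that style of accounting around a named lock; it illustrates the pattern only and is not oslo.concurrency itself:

    import threading
    import time
    from contextlib import contextmanager

    _locks: dict[str, threading.Lock] = {}


    @contextmanager
    def timed_lock(name: str, by: str):
        """Acquire a named lock and report wait/hold times, like the log lines."""
        lock = _locks.setdefault(name, threading.Lock())
        t0 = time.monotonic()
        with lock:
            waited = time.monotonic() - t0
            print(f'Lock "{name}" acquired by "{by}" :: waited {waited:.3f}s')
            t1 = time.monotonic()
            try:
                yield
            finally:
                held = time.monotonic() - t1
                print(f'Lock "{name}" "released" by "{by}" :: held {held:.3f}s')


    if __name__ == "__main__":
        with timed_lock("42f4c546-47e4-485b-be29-4081c7557bad",
                        "_locked_do_build_and_run_instance"):
            time.sleep(0.1)   # stand-in for the work done while building the instance
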
Jul 27 09:30:32 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-571092ad-27a5-4e72-b88c-d36bcbadaa88 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Lock "25214e8a-c626-46a7-b273-eb491c2fc91b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 15.850s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:33 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:34 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:35 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:36 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:38 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Acquiring lock "42f4c546-47e4-485b-be29-4081c7557bad" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:38 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Lock "42f4c546-47e4-485b-be29-4081c7557bad" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.005s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:38 user nova-compute[70374]: DEBUG nova.compute.manager [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Starting instance... 
{{(pid=70374) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} Jul 27 09:30:38 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:38 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:38 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:38 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=70374) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Jul 27 09:30:38 user nova-compute[70374]: INFO nova.compute.claims [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Claim successful on node user Jul 27 09:30:39 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:39 user nova-compute[70374]: DEBUG nova.compute.provider_tree [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Inventory has not changed in ProviderTree for provider: f7548644-4a09-4ad8-9aa6-6e05d85a9f5b {{(pid=70374) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Jul 27 09:30:39 user nova-compute[70374]: DEBUG nova.scheduler.client.report [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Inventory has not changed for provider f7548644-4a09-4ad8-9aa6-6e05d85a9f5b based on inventory data: {'MEMORY_MB': {'total': 16011, 'reserved': 512, 'min_unit': 1, 'max_unit': 16011, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70374) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Jul 27 09:30:39 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.780s {{(pid=70374) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:39 user nova-compute[70374]: DEBUG nova.compute.manager [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Start building networks asynchronously for instance. {{(pid=70374) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} Jul 27 09:30:39 user nova-compute[70374]: DEBUG nova.compute.manager [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Allocating IP information in the background. {{(pid=70374) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} Jul 27 09:30:39 user nova-compute[70374]: DEBUG nova.network.neutron [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] allocate_for_instance() {{(pid=70374) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} Jul 27 09:30:39 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Jul 27 09:30:39 user nova-compute[70374]: DEBUG nova.compute.manager [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Start building block device mappings for instance. {{(pid=70374) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} Jul 27 09:30:40 user nova-compute[70374]: DEBUG nova.compute.manager [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Start spawning the instance on the hypervisor. 
{{(pid=70374) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} Jul 27 09:30:40 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Creating instance directory {{(pid=70374) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Jul 27 09:30:40 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Creating image(s) Jul 27 09:30:40 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Acquiring lock "/opt/stack/data/nova/instances/42f4c546-47e4-485b-be29-4081c7557bad/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:40 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Lock "/opt/stack/data/nova/instances/42f4c546-47e4-485b-be29-4081c7557bad/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:40 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Lock "/opt/stack/data/nova/instances/42f4c546-47e4-485b-be29-4081c7557bad/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:40 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:40 user nova-compute[70374]: DEBUG nova.policy [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9eecf77414d9472e9e3cd10aec673afb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'aa3b4d25fdc94c4ea2e148e16478b40c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': 
None, 'service_roles': []} {{(pid=70374) authorize /opt/stack/nova/nova/policy.py:203}} Jul 27 09:30:40 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json" returned: 0 in 0.331s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:40 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Acquiring lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:40 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:40 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:40 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json" returned: 0 in 0.225s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:40 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1,backing_fmt=raw /opt/stack/data/nova/instances/42f4c546-47e4-485b-be29-4081c7557bad/disk 1073741824 {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:40 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 
tempest-VolumesAdminNegativeTest-1398001736-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1,backing_fmt=raw /opt/stack/data/nova/instances/42f4c546-47e4-485b-be29-4081c7557bad/disk 1073741824" returned: 0 in 0.146s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:40 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.379s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:40 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:41 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json" returned: 0 in 0.249s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:41 user nova-compute[70374]: DEBUG nova.virt.disk.api [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Checking if we can resize image /opt/stack/data/nova/instances/42f4c546-47e4-485b-be29-4081c7557bad/disk. 
size=1073741824 {{(pid=70374) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Jul 27 09:30:41 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/42f4c546-47e4-485b-be29-4081c7557bad/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:41 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/42f4c546-47e4-485b-be29-4081c7557bad/disk --force-share --output=json" returned: 0 in 0.221s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:41 user nova-compute[70374]: DEBUG nova.virt.disk.api [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Cannot resize image /opt/stack/data/nova/instances/42f4c546-47e4-485b-be29-4081c7557bad/disk to a smaller size. {{(pid=70374) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Jul 27 09:30:41 user nova-compute[70374]: DEBUG nova.objects.instance [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Lazy-loading 'migration_context' on Instance uuid 42f4c546-47e4-485b-be29-4081c7557bad {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:30:41 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Created local disks {{(pid=70374) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4851}} Jul 27 09:30:41 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Ensure instance console log exists: /opt/stack/data/nova/instances/42f4c546-47e4-485b-be29-4081c7557bad/console.log {{(pid=70374) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4603}} Jul 27 09:30:41 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:41 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Lock "vgpu_resources" acquired by 
"nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:41 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:42 user nova-compute[70374]: DEBUG nova.network.neutron [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Successfully created port: cc9444e6-2840-433c-b7c9-c0df023f6e43 {{(pid=70374) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} Jul 27 09:30:43 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:43 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:44 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:45 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Acquiring lock "309c9c26-4a0f-45db-bb3a-595b19f3f627" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:45 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Lock "309c9c26-4a0f-45db-bb3a-595b19f3f627" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.005s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:45 user nova-compute[70374]: DEBUG nova.compute.manager [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Starting instance... 
{{(pid=70374) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} Jul 27 09:30:45 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:45 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:45 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=70374) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Jul 27 09:30:45 user nova-compute[70374]: INFO nova.compute.claims [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Claim successful on node user Jul 27 09:30:45 user nova-compute[70374]: DEBUG nova.network.neutron [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Successfully updated port: cc9444e6-2840-433c-b7c9-c0df023f6e43 {{(pid=70374) _update_port /opt/stack/nova/nova/network/neutron.py:586}} Jul 27 09:30:45 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Acquiring lock "refresh_cache-42f4c546-47e4-485b-be29-4081c7557bad" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Jul 27 09:30:45 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Acquired lock "refresh_cache-42f4c546-47e4-485b-be29-4081c7557bad" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Jul 27 09:30:45 user nova-compute[70374]: DEBUG nova.network.neutron [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Building network info cache for instance {{(pid=70374) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} Jul 27 09:30:45 user nova-compute[70374]: DEBUG nova.compute.manager [req-507c555b-d164-4237-8a47-3c15957e423c req-bc88dd74-b059-4699-afb5-1997246f8556 service nova] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Received event network-changed-cc9444e6-2840-433c-b7c9-c0df023f6e43 {{(pid=70374) external_instance_event 
/opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:30:45 user nova-compute[70374]: DEBUG nova.compute.manager [req-507c555b-d164-4237-8a47-3c15957e423c req-bc88dd74-b059-4699-afb5-1997246f8556 service nova] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Refreshing instance network info cache due to event network-changed-cc9444e6-2840-433c-b7c9-c0df023f6e43. {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Jul 27 09:30:45 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-507c555b-d164-4237-8a47-3c15957e423c req-bc88dd74-b059-4699-afb5-1997246f8556 service nova] Acquiring lock "refresh_cache-42f4c546-47e4-485b-be29-4081c7557bad" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Jul 27 09:30:45 user nova-compute[70374]: DEBUG nova.network.neutron [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Instance cache missing network info. {{(pid=70374) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Acquiring lock "d35fe056-8279-479a-a673-6c61e5ec6933" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Lock "d35fe056-8279-479a-a673-6c61e5ec6933" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG nova.compute.provider_tree [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Inventory has not changed in ProviderTree for provider: f7548644-4a09-4ad8-9aa6-6e05d85a9f5b {{(pid=70374) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG nova.compute.manager [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Starting instance... 
{{(pid=70374) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG nova.scheduler.client.report [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Inventory has not changed for provider f7548644-4a09-4ad8-9aa6-6e05d85a9f5b based on inventory data: {'MEMORY_MB': {'total': 16011, 'reserved': 512, 'min_unit': 1, 'max_unit': 16011, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70374) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.955s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG nova.compute.manager [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Start building networks asynchronously for instance. {{(pid=70374) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=70374) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Jul 27 09:30:46 user nova-compute[70374]: INFO nova.compute.claims [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Claim successful on node user Jul 27 09:30:46 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Acquiring lock "773917c6-56d7-4491-a760-05f51593b7f0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Lock "773917c6-56d7-4491-a760-05f51593b7f0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG nova.compute.manager [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Starting instance... {{(pid=70374) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG nova.network.neutron [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Updating instance_info_cache with network_info: [{"id": "cc9444e6-2840-433c-b7c9-c0df023f6e43", "address": "fa:16:3e:bf:c6:71", "network": {"id": "f60fad83-a431-45cc-8f33-49bbd1e52e4c", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2025296099-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "aa3b4d25fdc94c4ea2e148e16478b40c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc9444e6-28", "ovs_interfaceid": "cc9444e6-2840-433c-b7c9-c0df023f6e43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG nova.compute.manager [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Allocating IP information in the background. 
{{(pid=70374) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG nova.network.neutron [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] allocate_for_instance() {{(pid=70374) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Releasing lock "refresh_cache-42f4c546-47e4-485b-be29-4081c7557bad" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG nova.compute.manager [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Instance network_info: |[{"id": "cc9444e6-2840-433c-b7c9-c0df023f6e43", "address": "fa:16:3e:bf:c6:71", "network": {"id": "f60fad83-a431-45cc-8f33-49bbd1e52e4c", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2025296099-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "aa3b4d25fdc94c4ea2e148e16478b40c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc9444e6-28", "ovs_interfaceid": "cc9444e6-2840-433c-b7c9-c0df023f6e43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70374) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-507c555b-d164-4237-8a47-3c15957e423c req-bc88dd74-b059-4699-afb5-1997246f8556 service nova] Acquired lock "refresh_cache-42f4c546-47e4-485b-be29-4081c7557bad" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG nova.network.neutron [req-507c555b-d164-4237-8a47-3c15957e423c req-bc88dd74-b059-4699-afb5-1997246f8556 service nova] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Refreshing network info cache for port cc9444e6-2840-433c-b7c9-c0df023f6e43 {{(pid=70374) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Start _get_guest_xml network_info=[{"id": "cc9444e6-2840-433c-b7c9-c0df023f6e43", "address": "fa:16:3e:bf:c6:71", "network": {"id": "f60fad83-a431-45cc-8f33-49bbd1e52e4c", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2025296099-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", 
"type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "aa3b4d25fdc94c4ea2e148e16478b40c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc9444e6-28", "ovs_interfaceid": "cc9444e6-2840-433c-b7c9-c0df023f6e43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-07-27T09:26:27Z,direct_url=,disk_format='qcow2',id=35458adf-261a-4e0b-a4db-b243619b2394,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='37aff788807d4b2aafc10d355583f7c7',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-07-27T09:26:29Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encrypted': False, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'size': 0, 'encryption_options': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'image_id': '35458adf-261a-4e0b-a4db-b243619b2394'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70374) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7553}} Jul 27 09:30:46 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Jul 27 09:30:46 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Jul 27 09:30:46 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Ignoring supplied device name: /dev/sda. Libvirt can't honour user-supplied dev names Jul 27 09:30:46 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Searching host: 'user' for CPU controller through CGroups V1... {{(pid=70374) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1650}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] CPU controller missing on host. 
{{(pid=70374) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1660}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG nova.compute.manager [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Start building block device mappings for instance. {{(pid=70374) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Searching host: 'user' for CPU controller through CGroups V2... {{(pid=70374) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1669}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] CPU controller found on host. {{(pid=70374) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1676}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70374) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-07-27T09:26:16Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1',id=6,is_public=True,memory_mb=512,name='m1.tiny',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-07-27T09:26:27Z,direct_url=,disk_format='qcow2',id=35458adf-261a-4e0b-a4db-b243619b2394,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='37aff788807d4b2aafc10d355583f7c7',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-07-27T09:26:29Z,virtual_size=,visibility=), allow threads: True {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Flavor limits 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c 
tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Image limits 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Flavor pref 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Image pref 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70374) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Got 1 possible topologies {{(pid=70374) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG nova.virt.libvirt.vif [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-07-27T09:30:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-1688768109',display_name='tempest-VolumesAdminNegativeTest-server-1688768109',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(6),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-1688768109',id=7,image_ref='35458adf-261a-4e0b-a4db-b243619b2394',info_cache=InstanceInfoCache,instance_type_id=6,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB50mdHMzOWpotVJV8Hy2B+a9kMnUfx8ncrgXuU5l2s3b2+0s8/IzVYVCK3PIMbOCNWJ+oVutS4E8rj/IvnO7glUGqYpvl5OiKzQBseAPNzkTYHLXKqMWqY/freXFLIXKA==',key_name='tempest-keypair-110342931',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='aa3b4d25fdc94c4ea2e148e16478b40c',ramdisk_id='',reservation_id='r-xt39zy6t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='35458adf-261a-4e0b-a4db-b243619b2394',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-1398001736',owner_user_name='tempest-VolumesAdminNegativeTest-1398001736-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-07-27T09:30:40Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9eecf77414d9472e9e3cd10aec673afb',uuid=42f4c546-47e4-485b-be29-4081c7557bad,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cc9444e6-2840-433c-b7c9-c0df023f6e43", "address": "fa:16:3e:bf:c6:71", "network": {"id": "f60fad83-a431-45cc-8f33-49bbd1e52e4c", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2025296099-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "aa3b4d25fdc94c4ea2e148e16478b40c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc9444e6-28", "ovs_interfaceid": "cc9444e6-2840-433c-b7c9-c0df023f6e43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70374) get_config 
/opt/stack/nova/nova/virt/libvirt/vif.py:563}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Converting VIF {"id": "cc9444e6-2840-433c-b7c9-c0df023f6e43", "address": "fa:16:3e:bf:c6:71", "network": {"id": "f60fad83-a431-45cc-8f33-49bbd1e52e4c", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2025296099-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "aa3b4d25fdc94c4ea2e148e16478b40c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc9444e6-28", "ovs_interfaceid": "cc9444e6-2840-433c-b7c9-c0df023f6e43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bf:c6:71,bridge_name='br-int',has_traffic_filtering=True,id=cc9444e6-2840-433c-b7c9-c0df023f6e43,network=Network(f60fad83-a431-45cc-8f33-49bbd1e52e4c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc9444e6-28') {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG nova.objects.instance [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Lazy-loading 'pci_devices' on Instance uuid 42f4c546-47e4-485b-be29-4081c7557bad {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG nova.policy [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '103aa251c26c4987814bc5973d86e601', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cdd638e5400740279443a374e3e570d4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70374) authorize /opt/stack/nova/nova/policy.py:203}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] End _get_guest_xml xml= Jul 27 09:30:46 user nova-compute[70374]: 42f4c546-47e4-485b-be29-4081c7557bad Jul 27 09:30:46 user nova-compute[70374]: 
[libvirt guest XML elided: the domain definition emitted by _get_guest_xml lost its markup in capture, leaving only text nodes interleaved with timestamps. Recoverable values: domain name instance-00000007, memory 524288 KiB, 1 vCPU, Nova metadata name tempest-VolumesAdminNegativeTest-server-1688768109 created 2023-07-27 09:30:46 (flavor: 512 MB RAM, 1 vCPU, 1 GB root disk, 0 GB ephemeral, 0 swap), owner tempest-VolumesAdminNegativeTest-1398001736 / tempest-VolumesAdminNegativeTest-1398001736-project-member, SMBIOS sysinfo OpenStack Foundation / OpenStack Nova / 0.0.0, system UUID 42f4c546-47e4-485b-be29-4081c7557bad ("Virtual Machine"), OS type hvm, CPU model Nehalem, RNG backend /dev/urandom.]
nova-compute[70374]: Jul 27 09:30:46 user nova-compute[70374]: Jul 27 09:30:46 user nova-compute[70374]: Jul 27 09:30:46 user nova-compute[70374]: Jul 27 09:30:46 user nova-compute[70374]: Jul 27 09:30:46 user nova-compute[70374]: Jul 27 09:30:46 user nova-compute[70374]: {{(pid=70374) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7559}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG nova.virt.libvirt.vif [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-07-27T09:30:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-1688768109',display_name='tempest-VolumesAdminNegativeTest-server-1688768109',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(6),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-1688768109',id=7,image_ref='35458adf-261a-4e0b-a4db-b243619b2394',info_cache=InstanceInfoCache,instance_type_id=6,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB50mdHMzOWpotVJV8Hy2B+a9kMnUfx8ncrgXuU5l2s3b2+0s8/IzVYVCK3PIMbOCNWJ+oVutS4E8rj/IvnO7glUGqYpvl5OiKzQBseAPNzkTYHLXKqMWqY/freXFLIXKA==',key_name='tempest-keypair-110342931',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='aa3b4d25fdc94c4ea2e148e16478b40c',ramdisk_id='',reservation_id='r-xt39zy6t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='35458adf-261a-4e0b-a4db-b243619b2394',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-1398001736',owner_user_name='tempest-VolumesAdminNegativeTest-1398001736-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-07-27T09:30:40Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9eecf77414d9472e9e3cd10aec673afb',uuid=42f4c546-47e4-485b-be29-4081c7557bad,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cc9444e6-2840-433c-b7c9-c0df023f6e43", "address": "fa:16:3e:bf:c6:71", "network": {"id": "f60fad83-a431-45cc-8f33-49bbd1e52e4c", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2025296099-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], 
"version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "aa3b4d25fdc94c4ea2e148e16478b40c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc9444e6-28", "ovs_interfaceid": "cc9444e6-2840-433c-b7c9-c0df023f6e43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Converting VIF {"id": "cc9444e6-2840-433c-b7c9-c0df023f6e43", "address": "fa:16:3e:bf:c6:71", "network": {"id": "f60fad83-a431-45cc-8f33-49bbd1e52e4c", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2025296099-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "aa3b4d25fdc94c4ea2e148e16478b40c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc9444e6-28", "ovs_interfaceid": "cc9444e6-2840-433c-b7c9-c0df023f6e43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bf:c6:71,bridge_name='br-int',has_traffic_filtering=True,id=cc9444e6-2840-433c-b7c9-c0df023f6e43,network=Network(f60fad83-a431-45cc-8f33-49bbd1e52e4c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc9444e6-28') {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG os_vif [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bf:c6:71,bridge_name='br-int',has_traffic_filtering=True,id=cc9444e6-2840-433c-b7c9-c0df023f6e43,network=Network(f60fad83-a431-45cc-8f33-49bbd1e52e4c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc9444e6-28') {{(pid=70374) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) 
{{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcc9444e6-28, may_exist=True, interface_attrs={}) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcc9444e6-28, col_values=(('external_ids', {'iface-id': 'cc9444e6-2840-433c-b7c9-c0df023f6e43', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bf:c6:71', 'vm-uuid': '42f4c546-47e4-485b-be29-4081c7557bad'}),), if_exists=True) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:46 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Jul 27 09:30:47 user nova-compute[70374]: INFO os_vif [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bf:c6:71,bridge_name='br-int',has_traffic_filtering=True,id=cc9444e6-2840-433c-b7c9-c0df023f6e43,network=Network(f60fad83-a431-45cc-8f33-49bbd1e52e4c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc9444e6-28') Jul 27 09:30:47 user nova-compute[70374]: DEBUG nova.compute.manager [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Start spawning the instance on the hypervisor. 
{{(pid=70374) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} Jul 27 09:30:47 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Creating instance directory {{(pid=70374) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Jul 27 09:30:47 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Creating image(s) Jul 27 09:30:47 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Acquiring lock "/opt/stack/data/nova/instances/309c9c26-4a0f-45db-bb3a-595b19f3f627/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:47 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Lock "/opt/stack/data/nova/instances/309c9c26-4a0f-45db-bb3a-595b19f3f627/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:47 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Lock "/opt/stack/data/nova/instances/309c9c26-4a0f-45db-bb3a-595b19f3f627/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.005s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:47 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Acquiring lock "f45ce42723178ed0c3f545cbb0d68bcf7bb43056" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:47 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Lock "f45ce42723178ed0c3f545cbb0d68bcf7bb43056" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:47 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=70374) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Jul 27 09:30:47 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] No VIF found with MAC fa:16:3e:bf:c6:71, not building metadata {{(pid=70374) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Jul 27 09:30:47 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/f45ce42723178ed0c3f545cbb0d68bcf7bb43056.part --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:47 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/f45ce42723178ed0c3f545cbb0d68bcf7bb43056.part --force-share --output=json" returned: 0 in 0.197s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:47 user nova-compute[70374]: DEBUG nova.virt.images [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] 726e1210-3ce3-486f-9417-95adaf9ac235 was qcow2, converting to raw {{(pid=70374) fetch_to_raw /opt/stack/nova/nova/virt/images.py:165}} Jul 27 09:30:47 user nova-compute[70374]: DEBUG nova.privsep.utils [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=70374) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Jul 27 09:30:47 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/f45ce42723178ed0c3f545cbb0d68bcf7bb43056.part /opt/stack/data/nova/instances/_base/f45ce42723178ed0c3f545cbb0d68bcf7bb43056.converted {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:47 user nova-compute[70374]: DEBUG nova.compute.provider_tree [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Inventory has not changed in ProviderTree for provider: f7548644-4a09-4ad8-9aa6-6e05d85a9f5b {{(pid=70374) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Jul 27 09:30:47 user nova-compute[70374]: DEBUG nova.scheduler.client.report [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Inventory has not changed for provider f7548644-4a09-4ad8-9aa6-6e05d85a9f5b based on 
inventory data: {'MEMORY_MB': {'total': 16011, 'reserved': 512, 'min_unit': 1, 'max_unit': 16011, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70374) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Jul 27 09:30:47 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/f45ce42723178ed0c3f545cbb0d68bcf7bb43056.part /opt/stack/data/nova/instances/_base/f45ce42723178ed0c3f545cbb0d68bcf7bb43056.converted" returned: 0 in 0.230s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:47 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/f45ce42723178ed0c3f545cbb0d68bcf7bb43056.converted --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:48 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.794s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:48 user nova-compute[70374]: DEBUG nova.compute.manager [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Start building networks asynchronously for instance. {{(pid=70374) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} Jul 27 09:30:48 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 1.298s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:48 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=70374) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Jul 27 09:30:48 user nova-compute[70374]: INFO nova.compute.claims [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Claim successful on node user Jul 27 09:30:48 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/f45ce42723178ed0c3f545cbb0d68bcf7bb43056.converted --force-share --output=json" returned: 0 in 0.164s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:48 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Lock "f45ce42723178ed0c3f545cbb0d68bcf7bb43056" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 1.127s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:48 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/f45ce42723178ed0c3f545cbb0d68bcf7bb43056 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:48 user nova-compute[70374]: DEBUG nova.compute.manager [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Allocating IP information in the background. {{(pid=70374) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} Jul 27 09:30:48 user nova-compute[70374]: DEBUG nova.network.neutron [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] allocate_for_instance() {{(pid=70374) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} Jul 27 09:30:48 user nova-compute[70374]: DEBUG nova.network.neutron [req-507c555b-d164-4237-8a47-3c15957e423c req-bc88dd74-b059-4699-afb5-1997246f8556 service nova] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Updated VIF entry in instance network info cache for port cc9444e6-2840-433c-b7c9-c0df023f6e43. 
{{(pid=70374) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Jul 27 09:30:48 user nova-compute[70374]: DEBUG nova.network.neutron [req-507c555b-d164-4237-8a47-3c15957e423c req-bc88dd74-b059-4699-afb5-1997246f8556 service nova] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Updating instance_info_cache with network_info: [{"id": "cc9444e6-2840-433c-b7c9-c0df023f6e43", "address": "fa:16:3e:bf:c6:71", "network": {"id": "f60fad83-a431-45cc-8f33-49bbd1e52e4c", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2025296099-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "aa3b4d25fdc94c4ea2e148e16478b40c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc9444e6-28", "ovs_interfaceid": "cc9444e6-2840-433c-b7c9-c0df023f6e43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:30:48 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Jul 27 09:30:48 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-507c555b-d164-4237-8a47-3c15957e423c req-bc88dd74-b059-4699-afb5-1997246f8556 service nova] Releasing lock "refresh_cache-42f4c546-47e4-485b-be29-4081c7557bad" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Jul 27 09:30:48 user nova-compute[70374]: DEBUG nova.compute.manager [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Start building block device mappings for instance. 
{{(pid=70374) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} Jul 27 09:30:48 user nova-compute[70374]: DEBUG nova.network.neutron [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Successfully created port: 626b287e-22ee-49d7-8ec3-c1a0660bd5d8 {{(pid=70374) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} Jul 27 09:30:48 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/f45ce42723178ed0c3f545cbb0d68bcf7bb43056 --force-share --output=json" returned: 0 in 0.214s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:48 user nova-compute[70374]: DEBUG nova.policy [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9597f10b87aa426b812375b1770d3095', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2010f7a57b654bee8736f3ed8d805b2c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70374) authorize /opt/stack/nova/nova/policy.py:203}} Jul 27 09:30:48 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Acquiring lock "f45ce42723178ed0c3f545cbb0d68bcf7bb43056" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:48 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Lock "f45ce42723178ed0c3f545cbb0d68bcf7bb43056" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:48 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/f45ce42723178ed0c3f545cbb0d68bcf7bb43056 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:48 user nova-compute[70374]: DEBUG nova.compute.manager [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Start 
spawning the instance on the hypervisor. {{(pid=70374) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} Jul 27 09:30:48 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Creating instance directory {{(pid=70374) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Jul 27 09:30:48 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Creating image(s) Jul 27 09:30:48 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Acquiring lock "/opt/stack/data/nova/instances/d35fe056-8279-479a-a673-6c61e5ec6933/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:48 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Lock "/opt/stack/data/nova/instances/d35fe056-8279-479a-a673-6c61e5ec6933/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:48 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Lock "/opt/stack/data/nova/instances/d35fe056-8279-479a-a673-6c61e5ec6933/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.003s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:48 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:48 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/f45ce42723178ed0c3f545cbb0d68bcf7bb43056 --force-share --output=json" returned: 0 in 0.242s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:48 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 
tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/f45ce42723178ed0c3f545cbb0d68bcf7bb43056,backing_fmt=raw /opt/stack/data/nova/instances/309c9c26-4a0f-45db-bb3a-595b19f3f627/disk 1073741824 {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:48 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:48 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json" returned: 0 in 0.178s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:48 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Acquiring lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:48 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:48 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:48 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/f45ce42723178ed0c3f545cbb0d68bcf7bb43056,backing_fmt=raw /opt/stack/data/nova/instances/309c9c26-4a0f-45db-bb3a-595b19f3f627/disk 1073741824" returned: 0 in 0.138s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:48 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Lock "f45ce42723178ed0c3f545cbb0d68bcf7bb43056" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: 
held 0.400s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:48 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/f45ce42723178ed0c3f545cbb0d68bcf7bb43056 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:48 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:48 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json" returned: 0 in 0.217s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:48 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1,backing_fmt=raw /opt/stack/data/nova/instances/d35fe056-8279-479a-a673-6c61e5ec6933/disk 1073741824 {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/f45ce42723178ed0c3f545cbb0d68bcf7bb43056 --force-share --output=json" returned: 0 in 0.265s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG nova.virt.disk.api [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Checking if we can resize image /opt/stack/data/nova/instances/309c9c26-4a0f-45db-bb3a-595b19f3f627/disk. 
size=1073741824 {{(pid=70374) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/309c9c26-4a0f-45db-bb3a-595b19f3f627/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1,backing_fmt=raw /opt/stack/data/nova/instances/d35fe056-8279-479a-a673-6c61e5ec6933/disk 1073741824" returned: 0 in 0.181s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.403s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG nova.compute.manager [req-d676d539-77b2-4780-9efa-2aca9ff24935 req-972b4e2f-141c-425e-bb0b-e7c9ff9cdf13 service nova] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Received event network-vif-plugged-cc9444e6-2840-433c-b7c9-c0df023f6e43 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-d676d539-77b2-4780-9efa-2aca9ff24935 req-972b4e2f-141c-425e-bb0b-e7c9ff9cdf13 service nova] Acquiring lock "42f4c546-47e4-485b-be29-4081c7557bad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-d676d539-77b2-4780-9efa-2aca9ff24935 req-972b4e2f-141c-425e-bb0b-e7c9ff9cdf13 service nova] Lock "42f4c546-47e4-485b-be29-4081c7557bad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-d676d539-77b2-4780-9efa-2aca9ff24935 
req-972b4e2f-141c-425e-bb0b-e7c9ff9cdf13 service nova] Lock "42f4c546-47e4-485b-be29-4081c7557bad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG nova.compute.manager [req-d676d539-77b2-4780-9efa-2aca9ff24935 req-972b4e2f-141c-425e-bb0b-e7c9ff9cdf13 service nova] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] No waiting events found dispatching network-vif-plugged-cc9444e6-2840-433c-b7c9-c0df023f6e43 {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:30:49 user nova-compute[70374]: WARNING nova.compute.manager [req-d676d539-77b2-4780-9efa-2aca9ff24935 req-972b4e2f-141c-425e-bb0b-e7c9ff9cdf13 service nova] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Received unexpected event network-vif-plugged-cc9444e6-2840-433c-b7c9-c0df023f6e43 for instance with vm_state building and task_state spawning. Jul 27 09:30:49 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/309c9c26-4a0f-45db-bb3a-595b19f3f627/disk --force-share --output=json" returned: 0 in 0.174s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG nova.virt.disk.api [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Cannot resize image /opt/stack/data/nova/instances/309c9c26-4a0f-45db-bb3a-595b19f3f627/disk to a smaller size. 
{{(pid=70374) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG nova.objects.instance [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Lazy-loading 'migration_context' on Instance uuid 309c9c26-4a0f-45db-bb3a-595b19f3f627 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Created local disks {{(pid=70374) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4851}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Ensure instance console log exists: /opt/stack/data/nova/instances/309c9c26-4a0f-45db-bb3a-595b19f3f627/console.log {{(pid=70374) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4603}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG nova.compute.provider_tree [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Inventory has not changed in ProviderTree for provider: f7548644-4a09-4ad8-9aa6-6e05d85a9f5b {{(pid=70374) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG nova.scheduler.client.report [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Inventory has not changed for provider f7548644-4a09-4ad8-9aa6-6e05d85a9f5b based on inventory data: {'MEMORY_MB': {'total': 16011, 'reserved': 512, 'min_unit': 1, 'max_unit': 16011, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 
'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70374) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json" returned: 0 in 0.199s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG nova.virt.disk.api [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Checking if we can resize image /opt/stack/data/nova/instances/d35fe056-8279-479a-a673-6c61e5ec6933/disk. size=1073741824 {{(pid=70374) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d35fe056-8279-479a-a673-6c61e5ec6933/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.276s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG nova.compute.manager [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Start building networks asynchronously for instance. {{(pid=70374) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG nova.compute.manager [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Allocating IP information in the background. 
{{(pid=70374) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG nova.network.neutron [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] allocate_for_instance() {{(pid=70374) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} Jul 27 09:30:49 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Jul 27 09:30:49 user nova-compute[70374]: DEBUG nova.compute.manager [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Start building block device mappings for instance. {{(pid=70374) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG nova.network.neutron [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Successfully created port: 018c2bfb-405c-4bba-a54f-e8ea773e57bf {{(pid=70374) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d35fe056-8279-479a-a673-6c61e5ec6933/disk --force-share --output=json" returned: 0 in 0.321s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG nova.virt.disk.api [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Cannot resize image /opt/stack/data/nova/instances/d35fe056-8279-479a-a673-6c61e5ec6933/disk to a smaller size. 
{{(pid=70374) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG nova.objects.instance [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Lazy-loading 'migration_context' on Instance uuid d35fe056-8279-479a-a673-6c61e5ec6933 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Created local disks {{(pid=70374) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4851}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Ensure instance console log exists: /opt/stack/data/nova/instances/d35fe056-8279-479a-a673-6c61e5ec6933/console.log {{(pid=70374) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4603}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG nova.compute.manager [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Start spawning the instance on the hypervisor. 
{{(pid=70374) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Creating instance directory {{(pid=70374) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Jul 27 09:30:49 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Creating image(s) Jul 27 09:30:49 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Acquiring lock "/opt/stack/data/nova/instances/773917c6-56d7-4491-a760-05f51593b7f0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Lock "/opt/stack/data/nova/instances/773917c6-56d7-4491-a760-05f51593b7f0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Lock "/opt/stack/data/nova/instances/773917c6-56d7-4491-a760-05f51593b7f0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG nova.policy [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '459cb24159604669a118a1da67fbdf72', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd88292245b8d4bbfa07efc8084ae089c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70374) authorize /opt/stack/nova/nova/policy.py:203}} 
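The prlimit-wrapped "qemu-img info" probes above are what oslo.concurrency's processutils emits when a ProcessLimits object is passed to execute(): the child command is re-run under "python -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- ...". A minimal sketch of the same pattern, assuming oslo.concurrency and qemu-img are installed and reusing a disk path from the log:

    import json
    from oslo_concurrency import processutils

    disk_path = '/opt/stack/data/nova/instances/d35fe056-8279-479a-a673-6c61e5ec6933/disk'

    # Cap address space at 1 GiB and CPU time at 30 s, mirroring --as/--cpu in the logged command.
    limits = processutils.ProcessLimits(address_space=1073741824, cpu_time=30)
    out, _err = processutils.execute(
        'qemu-img', 'info', disk_path, '--force-share', '--output=json',
        prlimit=limits,
        env_variables={'LC_ALL': 'C', 'LANG': 'C'},
    )
    info = json.loads(out)
    print(info['format'], info['virtual-size'])  # fields from qemu-img's JSON output

Nova's can_resize_image check then compares the reported virtual size against the requested size, which is why the trace logs "Cannot resize image ... to a smaller size." when the disk already matches the flavor.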
Jul 27 09:30:49 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json" returned: 0 in 0.169s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Acquiring lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:49 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:50 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json" returned: 0 in 0.175s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:50 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1,backing_fmt=raw /opt/stack/data/nova/instances/773917c6-56d7-4491-a760-05f51593b7f0/disk 1073741824 {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:50 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1,backing_fmt=raw 
/opt/stack/data/nova/instances/773917c6-56d7-4491-a760-05f51593b7f0/disk 1073741824" returned: 0 in 0.075s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:50 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.262s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:50 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:50 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json" returned: 0 in 0.170s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:50 user nova-compute[70374]: DEBUG nova.virt.disk.api [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Checking if we can resize image /opt/stack/data/nova/instances/773917c6-56d7-4491-a760-05f51593b7f0/disk. 
size=1073741824 {{(pid=70374) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Jul 27 09:30:50 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/773917c6-56d7-4491-a760-05f51593b7f0/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:50 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/773917c6-56d7-4491-a760-05f51593b7f0/disk --force-share --output=json" returned: 0 in 0.166s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:50 user nova-compute[70374]: DEBUG nova.virt.disk.api [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Cannot resize image /opt/stack/data/nova/instances/773917c6-56d7-4491-a760-05f51593b7f0/disk to a smaller size. {{(pid=70374) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Jul 27 09:30:50 user nova-compute[70374]: DEBUG nova.objects.instance [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Lazy-loading 'migration_context' on Instance uuid 773917c6-56d7-4491-a760-05f51593b7f0 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:30:50 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Created local disks {{(pid=70374) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4851}} Jul 27 09:30:50 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Ensure instance console log exists: /opt/stack/data/nova/instances/773917c6-56d7-4491-a760-05f51593b7f0/console.log {{(pid=70374) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4603}} Jul 27 09:30:50 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:50 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.002s {{(pid=70374) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:50 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:50 user nova-compute[70374]: DEBUG nova.network.neutron [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Successfully updated port: 018c2bfb-405c-4bba-a54f-e8ea773e57bf {{(pid=70374) _update_port /opt/stack/nova/nova/network/neutron.py:586}} Jul 27 09:30:50 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Acquiring lock "refresh_cache-d35fe056-8279-479a-a673-6c61e5ec6933" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Jul 27 09:30:50 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Acquired lock "refresh_cache-d35fe056-8279-479a-a673-6c61e5ec6933" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Jul 27 09:30:50 user nova-compute[70374]: DEBUG nova.network.neutron [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Building network info cache for instance {{(pid=70374) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} Jul 27 09:30:50 user nova-compute[70374]: DEBUG nova.compute.manager [req-91ea84f1-9609-4208-bfec-0b0d8dc62bc8 req-dbe3b311-80ba-4872-aa04-8c38e6b58ce8 service nova] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Received event network-changed-018c2bfb-405c-4bba-a54f-e8ea773e57bf {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:30:50 user nova-compute[70374]: DEBUG nova.compute.manager [req-91ea84f1-9609-4208-bfec-0b0d8dc62bc8 req-dbe3b311-80ba-4872-aa04-8c38e6b58ce8 service nova] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Refreshing instance network info cache due to event network-changed-018c2bfb-405c-4bba-a54f-e8ea773e57bf. {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Jul 27 09:30:50 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-91ea84f1-9609-4208-bfec-0b0d8dc62bc8 req-dbe3b311-80ba-4872-aa04-8c38e6b58ce8 service nova] Acquiring lock "refresh_cache-d35fe056-8279-479a-a673-6c61e5ec6933" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Jul 27 09:30:50 user nova-compute[70374]: DEBUG nova.network.neutron [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Instance cache missing network info. 
{{(pid=70374) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.network.neutron [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Successfully updated port: 626b287e-22ee-49d7-8ec3-c1a0660bd5d8 {{(pid=70374) _update_port /opt/stack/nova/nova/network/neutron.py:586}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Acquiring lock "refresh_cache-309c9c26-4a0f-45db-bb3a-595b19f3f627" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Acquired lock "refresh_cache-309c9c26-4a0f-45db-bb3a-595b19f3f627" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.network.neutron [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Building network info cache for instance {{(pid=70374) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.virt.driver [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Emitting event Resumed> {{(pid=70374) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Jul 27 09:30:51 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] VM Resumed (Lifecycle Event) Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.compute.manager [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Instance event wait completed in 0 seconds for {{(pid=70374) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Guest created on hypervisor {{(pid=70374) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4411}} Jul 27 09:30:51 user nova-compute[70374]: INFO nova.virt.libvirt.driver [-] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Instance spawned successfully. 
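The overlay creation logged for instance 773917c6 ("qemu-img create -f qcow2 -o backing_file=...,backing_fmt=raw ... 1073741824") layers a copy-on-write qcow2 disk over the cached raw base image under _base. A rough equivalent of that single step, reusing the paths and size from the log and oslo.concurrency for execution (illustrative only; Nova's Qcow2 backend also serializes this behind the base-image lock shown above):

    from oslo_concurrency import processutils

    backing = '/opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1'
    overlay = '/opt/stack/data/nova/instances/773917c6-56d7-4491-a760-05f51593b7f0/disk'

    # Writes allocate clusters in the overlay; reads of untouched clusters fall through to the raw base image.
    processutils.execute(
        'qemu-img', 'create', '-f', 'qcow2',
        '-o', 'backing_file=%s,backing_fmt=raw' % backing,
        overlay, '1073741824',
        env_variables={'LC_ALL': 'C', 'LANG': 'C'},
    )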
Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:894}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.network.neutron [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Instance cache missing network info. {{(pid=70374) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.network.neutron [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Updating instance_info_cache with network_info: [{"id": "018c2bfb-405c-4bba-a54f-e8ea773e57bf", "address": "fa:16:3e:2d:07:78", "network": {"id": "aead18f5-45fa-4ce5-9c4a-086f9f9d07e6", "bridge": "br-int", "label": "tempest-TestVolumeSwap-383523410-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "2010f7a57b654bee8736f3ed8d805b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap018c2bfb-40", "ovs_interfaceid": "018c2bfb-405c-4bba-a54f-e8ea773e57bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70374) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Found default for hw_cdrom_bus of ide {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c 
tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Found default for hw_disk_bus of virtio {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Found default for hw_input_bus of None {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Found default for hw_pointer_model of None {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Found default for hw_video_model of virtio {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Found default for hw_vif_model of virtio {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Releasing lock "refresh_cache-d35fe056-8279-479a-a673-6c61e5ec6933" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.compute.manager [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Instance network_info: |[{"id": "018c2bfb-405c-4bba-a54f-e8ea773e57bf", "address": "fa:16:3e:2d:07:78", "network": {"id": "aead18f5-45fa-4ce5-9c4a-086f9f9d07e6", "bridge": "br-int", "label": "tempest-TestVolumeSwap-383523410-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "2010f7a57b654bee8736f3ed8d805b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap018c2bfb-40", "ovs_interfaceid": "018c2bfb-405c-4bba-a54f-e8ea773e57bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}]| {{(pid=70374) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-91ea84f1-9609-4208-bfec-0b0d8dc62bc8 req-dbe3b311-80ba-4872-aa04-8c38e6b58ce8 service nova] Acquired lock "refresh_cache-d35fe056-8279-479a-a673-6c61e5ec6933" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.network.neutron [req-91ea84f1-9609-4208-bfec-0b0d8dc62bc8 req-dbe3b311-80ba-4872-aa04-8c38e6b58ce8 service nova] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Refreshing network info cache for port 018c2bfb-405c-4bba-a54f-e8ea773e57bf {{(pid=70374) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Start _get_guest_xml network_info=[{"id": "018c2bfb-405c-4bba-a54f-e8ea773e57bf", "address": "fa:16:3e:2d:07:78", "network": {"id": "aead18f5-45fa-4ce5-9c4a-086f9f9d07e6", "bridge": "br-int", "label": "tempest-TestVolumeSwap-383523410-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "2010f7a57b654bee8736f3ed8d805b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap018c2bfb-40", "ovs_interfaceid": "018c2bfb-405c-4bba-a54f-e8ea773e57bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-07-27T09:26:27Z,direct_url=,disk_format='qcow2',id=35458adf-261a-4e0b-a4db-b243619b2394,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='37aff788807d4b2aafc10d355583f7c7',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-07-27T09:26:29Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encrypted': False, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'size': 0, 'encryption_options': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'image_id': '35458adf-261a-4e0b-a4db-b243619b2394'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70374) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7553}} Jul 27 09:30:51 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] During sync_power_state the instance has a pending task (spawning). Skip. 
Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.virt.driver [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Emitting event Started> {{(pid=70374) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Jul 27 09:30:51 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] VM Started (Lifecycle Event) Jul 27 09:30:51 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Jul 27 09:30:51 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Searching host: 'user' for CPU controller through CGroups V1... {{(pid=70374) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1650}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] CPU controller missing on host. {{(pid=70374) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1660}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Searching host: 'user' for CPU controller through CGroups V2... {{(pid=70374) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1669}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] CPU controller found on host. 
{{(pid=70374) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1676}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70374) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-07-27T09:26:16Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1',id=6,is_public=True,memory_mb=512,name='m1.tiny',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-07-27T09:26:27Z,direct_url=,disk_format='qcow2',id=35458adf-261a-4e0b-a4db-b243619b2394,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='37aff788807d4b2aafc10d355583f7c7',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-07-27T09:26:29Z,virtual_size=,visibility=), allow threads: True {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Flavor limits 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Image limits 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Flavor pref 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Image pref 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Jul 27 09:30:51 user 
nova-compute[70374]: DEBUG nova.virt.hardware [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70374) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Got 1 possible topologies {{(pid=70374) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.virt.libvirt.vif [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-07-27T09:30:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeSwap-server-1780453082',display_name='tempest-TestVolumeSwap-server-1780453082',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(6),hidden=False,host='user',hostname='tempest-testvolumeswap-server-1780453082',id=9,image_ref='35458adf-261a-4e0b-a4db-b243619b2394',info_cache=InstanceInfoCache,instance_type_id=6,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEaNbtFtor1bqTgF+b2crgVHLS9gnhKdKp9dMQdiKNpBIvJQwJpmePSBuDiTyWesfiLJRcs+yKbdVF9O+syPsBVqVGKrGBXSpMVjRmnYBFtocxj+nET2lsNkoXtTK+YyzQ==',key_name='tempest-keypair-612516559',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2010f7a57b654bee8736f3ed8d805b2c',ramdisk_id='',reservation_id='r-d2e70j2v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='35458adf-261a-4e0b-a4db-b243619b2394',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-TestVolumeSwap-987419093',owner_user_name='tempest-TestVolumeSwap-987419093-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-07-27T09:30:48Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9597f10b87aa426b812375b1770d3095',uuid=d35fe056-8279-479a-a673-6c61e5ec6933,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "018c2bfb-405c-4bba-a54f-e8ea773e57bf", "address": "fa:16:3e:2d:07:78", "network": {"id": "aead18f5-45fa-4ce5-9c4a-086f9f9d07e6", "bridge": "br-int", "label": "tempest-TestVolumeSwap-383523410-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "2010f7a57b654bee8736f3ed8d805b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap018c2bfb-40", "ovs_interfaceid": "018c2bfb-405c-4bba-a54f-e8ea773e57bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70374) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Converting VIF {"id": "018c2bfb-405c-4bba-a54f-e8ea773e57bf", "address": "fa:16:3e:2d:07:78", "network": {"id": "aead18f5-45fa-4ce5-9c4a-086f9f9d07e6", "bridge": "br-int", "label": "tempest-TestVolumeSwap-383523410-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "2010f7a57b654bee8736f3ed8d805b2c", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap018c2bfb-40", "ovs_interfaceid": "018c2bfb-405c-4bba-a54f-e8ea773e57bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:07:78,bridge_name='br-int',has_traffic_filtering=True,id=018c2bfb-405c-4bba-a54f-e8ea773e57bf,network=Network(aead18f5-45fa-4ce5-9c4a-086f9f9d07e6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap018c2bfb-40') {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.objects.instance [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Lazy-loading 'pci_devices' on Instance uuid d35fe056-8279-479a-a673-6c61e5ec6933 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.compute.manager [req-6e768fa6-3f82-4722-9d89-996be36124de req-0d92771b-ac1a-4d26-b788-0b0501d97cee service nova] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Received event network-vif-plugged-cc9444e6-2840-433c-b7c9-c0df023f6e43 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-6e768fa6-3f82-4722-9d89-996be36124de req-0d92771b-ac1a-4d26-b788-0b0501d97cee service nova] Acquiring lock "42f4c546-47e4-485b-be29-4081c7557bad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-6e768fa6-3f82-4722-9d89-996be36124de req-0d92771b-ac1a-4d26-b788-0b0501d97cee service nova] Lock "42f4c546-47e4-485b-be29-4081c7557bad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-6e768fa6-3f82-4722-9d89-996be36124de req-0d92771b-ac1a-4d26-b788-0b0501d97cee service nova] Lock "42f4c546-47e4-485b-be29-4081c7557bad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.compute.manager [req-6e768fa6-3f82-4722-9d89-996be36124de req-0d92771b-ac1a-4d26-b788-0b0501d97cee service nova] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] No waiting events found dispatching network-vif-plugged-cc9444e6-2840-433c-b7c9-c0df023f6e43 {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:30:51 user nova-compute[70374]: WARNING nova.compute.manager 
[req-6e768fa6-3f82-4722-9d89-996be36124de req-0d92771b-ac1a-4d26-b788-0b0501d97cee service nova] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Received unexpected event network-vif-plugged-cc9444e6-2840-433c-b7c9-c0df023f6e43 for instance with vm_state building and task_state spawning. Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.compute.manager [req-6e768fa6-3f82-4722-9d89-996be36124de req-0d92771b-ac1a-4d26-b788-0b0501d97cee service nova] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Received event network-changed-626b287e-22ee-49d7-8ec3-c1a0660bd5d8 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.compute.manager [req-6e768fa6-3f82-4722-9d89-996be36124de req-0d92771b-ac1a-4d26-b788-0b0501d97cee service nova] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Refreshing instance network info cache due to event network-changed-626b287e-22ee-49d7-8ec3-c1a0660bd5d8. {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-6e768fa6-3f82-4722-9d89-996be36124de req-0d92771b-ac1a-4d26-b788-0b0501d97cee service nova] Acquiring lock "refresh_cache-309c9c26-4a0f-45db-bb3a-595b19f3f627" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.network.neutron [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Successfully created port: c8fbf870-e2d7-4ec6-be17-98f3d9315f75 {{(pid=70374) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70374) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] End _get_guest_xml xml=
[guest domain XML not preserved in this capture: the element markup was stripped, leaving only per-line journald prefixes and element text; recoverable values include uuid d35fe056-8279-479a-a673-6c61e5ec6933, name instance-00000009, title tempest-TestVolumeSwap-server-1780453082, creation time 2023-07-27 09:30:51, memory 524288 KiB, 1 vCPU, flavor values 512/1/0/0/1, owner tempest-TestVolumeSwap-987419093-project-member / tempest-TestVolumeSwap-987419093, sysinfo OpenStack Foundation / OpenStack Nova / 0.0.0 / Virtual Machine, os type hvm, CPU model Nehalem, RNG backend /dev/urandom] {{(pid=70374) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7559}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.virt.libvirt.vif [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] vif_type=ovs
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-07-27T09:30:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeSwap-server-1780453082',display_name='tempest-TestVolumeSwap-server-1780453082',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(6),hidden=False,host='user',hostname='tempest-testvolumeswap-server-1780453082',id=9,image_ref='35458adf-261a-4e0b-a4db-b243619b2394',info_cache=InstanceInfoCache,instance_type_id=6,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEaNbtFtor1bqTgF+b2crgVHLS9gnhKdKp9dMQdiKNpBIvJQwJpmePSBuDiTyWesfiLJRcs+yKbdVF9O+syPsBVqVGKrGBXSpMVjRmnYBFtocxj+nET2lsNkoXtTK+YyzQ==',key_name='tempest-keypair-612516559',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2010f7a57b654bee8736f3ed8d805b2c',ramdisk_id='',reservation_id='r-d2e70j2v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='35458adf-261a-4e0b-a4db-b243619b2394',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-TestVolumeSwap-987419093',owner_user_name='tempest-TestVolumeSwap-987419093-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-07-27T09:30:48Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9597f10b87aa426b812375b1770d3095',uuid=d35fe056-8279-479a-a673-6c61e5ec6933,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "018c2bfb-405c-4bba-a54f-e8ea773e57bf", "address": "fa:16:3e:2d:07:78", "network": {"id": "aead18f5-45fa-4ce5-9c4a-086f9f9d07e6", "bridge": "br-int", "label": "tempest-TestVolumeSwap-383523410-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "2010f7a57b654bee8736f3ed8d805b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap018c2bfb-40", "ovs_interfaceid": "018c2bfb-405c-4bba-a54f-e8ea773e57bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Jul 27 09:30:51 user nova-compute[70374]: 
DEBUG nova.network.os_vif_util [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Converting VIF {"id": "018c2bfb-405c-4bba-a54f-e8ea773e57bf", "address": "fa:16:3e:2d:07:78", "network": {"id": "aead18f5-45fa-4ce5-9c4a-086f9f9d07e6", "bridge": "br-int", "label": "tempest-TestVolumeSwap-383523410-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "2010f7a57b654bee8736f3ed8d805b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap018c2bfb-40", "ovs_interfaceid": "018c2bfb-405c-4bba-a54f-e8ea773e57bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:07:78,bridge_name='br-int',has_traffic_filtering=True,id=018c2bfb-405c-4bba-a54f-e8ea773e57bf,network=Network(aead18f5-45fa-4ce5-9c4a-086f9f9d07e6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap018c2bfb-40') {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG os_vif [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:07:78,bridge_name='br-int',has_traffic_filtering=True,id=018c2bfb-405c-4bba-a54f-e8ea773e57bf,network=Network(aead18f5-45fa-4ce5-9c4a-086f9f9d07e6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap018c2bfb-40') {{(pid=70374) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap018c2bfb-40, may_exist=True, 
interface_attrs={}) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap018c2bfb-40, col_values=(('external_ids', {'iface-id': '018c2bfb-405c-4bba-a54f-e8ea773e57bf', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2d:07:78', 'vm-uuid': 'd35fe056-8279-479a-a673-6c61e5ec6933'}),), if_exists=True) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:51 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] During sync_power_state the instance has a pending task (spawning). Skip. Jul 27 09:30:51 user nova-compute[70374]: INFO nova.compute.manager [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Took 11.31 seconds to spawn the instance on the hypervisor. Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.compute.manager [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:51 user nova-compute[70374]: INFO os_vif [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:07:78,bridge_name='br-int',has_traffic_filtering=True,id=018c2bfb-405c-4bba-a54f-e8ea773e57bf,network=Network(aead18f5-45fa-4ce5-9c4a-086f9f9d07e6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap018c2bfb-40') Jul 27 09:30:51 user nova-compute[70374]: INFO nova.compute.manager [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Took 12.75 seconds to build instance. Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=70374) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] No VIF found with MAC fa:16:3e:2d:07:78, not building metadata {{(pid=70374) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-42453f95-88b3-4b3c-82a7-e81b0135f19c tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Lock "42f4c546-47e4-485b-be29-4081c7557bad" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.907s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.network.neutron [req-91ea84f1-9609-4208-bfec-0b0d8dc62bc8 req-dbe3b311-80ba-4872-aa04-8c38e6b58ce8 service nova] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Updated VIF entry in instance network info cache for port 018c2bfb-405c-4bba-a54f-e8ea773e57bf. {{(pid=70374) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG nova.network.neutron [req-91ea84f1-9609-4208-bfec-0b0d8dc62bc8 req-dbe3b311-80ba-4872-aa04-8c38e6b58ce8 service nova] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Updating instance_info_cache with network_info: [{"id": "018c2bfb-405c-4bba-a54f-e8ea773e57bf", "address": "fa:16:3e:2d:07:78", "network": {"id": "aead18f5-45fa-4ce5-9c4a-086f9f9d07e6", "bridge": "br-int", "label": "tempest-TestVolumeSwap-383523410-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "2010f7a57b654bee8736f3ed8d805b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap018c2bfb-40", "ovs_interfaceid": "018c2bfb-405c-4bba-a54f-e8ea773e57bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:30:51 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-91ea84f1-9609-4208-bfec-0b0d8dc62bc8 req-dbe3b311-80ba-4872-aa04-8c38e6b58ce8 service nova] Releasing lock "refresh_cache-d35fe056-8279-479a-a673-6c61e5ec6933" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG nova.network.neutron [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Updating instance_info_cache with network_info: [{"id": "626b287e-22ee-49d7-8ec3-c1a0660bd5d8", "address": "fa:16:3e:57:6c:c6", "network": {"id": "f6f1459d-22db-4bf7-8b7b-d6dec4fa18d5", "bridge": "br-int", "label": 
"tempest-AttachSCSIVolumeTestJSON-2084493807-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cdd638e5400740279443a374e3e570d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap626b287e-22", "ovs_interfaceid": "626b287e-22ee-49d7-8ec3-c1a0660bd5d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Releasing lock "refresh_cache-309c9c26-4a0f-45db-bb3a-595b19f3f627" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG nova.compute.manager [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Instance network_info: |[{"id": "626b287e-22ee-49d7-8ec3-c1a0660bd5d8", "address": "fa:16:3e:57:6c:c6", "network": {"id": "f6f1459d-22db-4bf7-8b7b-d6dec4fa18d5", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-2084493807-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cdd638e5400740279443a374e3e570d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap626b287e-22", "ovs_interfaceid": "626b287e-22ee-49d7-8ec3-c1a0660bd5d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70374) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-6e768fa6-3f82-4722-9d89-996be36124de req-0d92771b-ac1a-4d26-b788-0b0501d97cee service nova] Acquired lock "refresh_cache-309c9c26-4a0f-45db-bb3a-595b19f3f627" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG nova.network.neutron [req-6e768fa6-3f82-4722-9d89-996be36124de req-0d92771b-ac1a-4d26-b788-0b0501d97cee service nova] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Refreshing network info cache for port 626b287e-22ee-49d7-8ec3-c1a0660bd5d8 {{(pid=70374) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-19982597-4a94-4f10-8524-9853250946dc 
tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Start _get_guest_xml network_info=[{"id": "626b287e-22ee-49d7-8ec3-c1a0660bd5d8", "address": "fa:16:3e:57:6c:c6", "network": {"id": "f6f1459d-22db-4bf7-8b7b-d6dec4fa18d5", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-2084493807-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cdd638e5400740279443a374e3e570d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap626b287e-22", "ovs_interfaceid": "626b287e-22ee-49d7-8ec3-c1a0660bd5d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'scsi', 'cdrom_bus': 'scsi', 'mapping': {'root': {'bus': 'scsi', 'dev': 'sda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'scsi', 'dev': 'sda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'scsi', 'dev': 'sdb', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-07-27T09:30:30Z,direct_url=,disk_format='qcow2',id=726e1210-3ce3-486f-9417-95adaf9ac235,min_disk=0,min_ram=0,name='',owner='f6e8c1f6841a4a30857193d07747038e',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-07-27T09:30:34Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/sda', 'image': [{'device_name': '/dev/sda', 'encrypted': False, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'size': 0, 'encryption_options': None, 'disk_bus': 'scsi', 'encryption_secret_uuid': None, 'image_id': '726e1210-3ce3-486f-9417-95adaf9ac235'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70374) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7553}} Jul 27 09:30:52 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Jul 27 09:30:52 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Jul 27 09:30:52 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Searching host: 'user' for CPU controller through CGroups V1... 
{{(pid=70374) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1650}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] CPU controller missing on host. {{(pid=70374) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1660}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Searching host: 'user' for CPU controller through CGroups V2... {{(pid=70374) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1669}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] CPU controller found on host. {{(pid=70374) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1676}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70374) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-07-27T09:26:16Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1',id=6,is_public=True,memory_mb=512,name='m1.tiny',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-07-27T09:30:30Z,direct_url=,disk_format='qcow2',id=726e1210-3ce3-486f-9417-95adaf9ac235,min_disk=0,min_ram=0,name='',owner='f6e8c1f6841a4a30857193d07747038e',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-07-27T09:30:34Z,virtual_size=,visibility=), allow threads: True {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Flavor limits 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Image limits 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Flavor pref 0:0:0 {{(pid=70374) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:383}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Image pref 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70374) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Got 1 possible topologies {{(pid=70374) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG nova.virt.libvirt.vif [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2023-07-27T09:30:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-860974169',display_name='tempest-AttachSCSIVolumeTestJSON-server-860974169',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(6),hidden=False,host='user',hostname='tempest-attachscsivolumetestjson-server-860974169',id=8,image_ref='726e1210-3ce3-486f-9417-95adaf9ac235',info_cache=InstanceInfoCache,instance_type_id=6,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBK2UxiY9xja9j2e+bYNFRInNFJS7DjB/7i9A7rll9a0HowWQT/92Qp4K1SO60+YclaZAVgDJD7MloBXQCsn+BhUVizXOve/ao95EzmtZUBSUTgLTVXF8O7s8qW7tjuxGXQ==',key_name='tempest-keypair-936823846',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cdd638e5400740279443a374e3e570d4',ramdisk_id='',reservation_id='r-010r1p2b',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='726e1210-3ce3-486f-9417-95adaf9ac235',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_machine_type='pc',image_hw_scsi_model='virtio-scsi',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachSCSIVolumeTestJSON-284147988',owner_user_name='tempest-AttachSCSIVolumeTestJSON-284147988-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-07-27T09:30:47Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='103aa251c26c4987814bc5973d86e601',uuid=309c9c26-4a0f-45db-bb3a-595b19f3f627,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "626b287e-22ee-49d7-8ec3-c1a0660bd5d8", "address": "fa:16:3e:57:6c:c6", "network": {"id": "f6f1459d-22db-4bf7-8b7b-d6dec4fa18d5", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-2084493807-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cdd638e5400740279443a374e3e570d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap626b287e-22", "ovs_interfaceid": "626b287e-22ee-49d7-8ec3-c1a0660bd5d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70374) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG nova.network.os_vif_util 
[None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Converting VIF {"id": "626b287e-22ee-49d7-8ec3-c1a0660bd5d8", "address": "fa:16:3e:57:6c:c6", "network": {"id": "f6f1459d-22db-4bf7-8b7b-d6dec4fa18d5", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-2084493807-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cdd638e5400740279443a374e3e570d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap626b287e-22", "ovs_interfaceid": "626b287e-22ee-49d7-8ec3-c1a0660bd5d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:6c:c6,bridge_name='br-int',has_traffic_filtering=True,id=626b287e-22ee-49d7-8ec3-c1a0660bd5d8,network=Network(f6f1459d-22db-4bf7-8b7b-d6dec4fa18d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap626b287e-22') {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG nova.objects.instance [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Lazy-loading 'pci_devices' on Instance uuid 309c9c26-4a0f-45db-bb3a-595b19f3f627 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] End _get_guest_xml xml=
[guest domain XML for instance 309c9c26-4a0f-45db-bb3a-595b19f3f627 elided: the XML tags were stripped during log capture, leaving only the text values 309c9c26-4a0f-45db-bb3a-595b19f3f627, instance-00000008, 524288, 1, tempest-AttachSCSIVolumeTestJSON-server-860974169, 2023-07-27 09:30:52, 512, 1, 0, 0, 1, tempest-AttachSCSIVolumeTestJSON-284147988-project-member, tempest-AttachSCSIVolumeTestJSON-284147988, OpenStack Foundation, OpenStack Nova, 0.0.0, Virtual Machine, hvm, Nehalem, /dev/urandom] {{(pid=70374) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7559}}
Jul 27 09:30:52 user nova-compute[70374]: DEBUG nova.virt.libvirt.vif [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2023-07-27T09:30:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-860974169',display_name='tempest-AttachSCSIVolumeTestJSON-server-860974169',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(6),hidden=False,host='user',hostname='tempest-attachscsivolumetestjson-server-860974169',id=8,image_ref='726e1210-3ce3-486f-9417-95adaf9ac235',info_cache=InstanceInfoCache,instance_type_id=6,kernel_id='',key_data='ecdsa-sha2-nistp384
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBK2UxiY9xja9j2e+bYNFRInNFJS7DjB/7i9A7rll9a0HowWQT/92Qp4K1SO60+YclaZAVgDJD7MloBXQCsn+BhUVizXOve/ao95EzmtZUBSUTgLTVXF8O7s8qW7tjuxGXQ==',key_name='tempest-keypair-936823846',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cdd638e5400740279443a374e3e570d4',ramdisk_id='',reservation_id='r-010r1p2b',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='726e1210-3ce3-486f-9417-95adaf9ac235',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_machine_type='pc',image_hw_scsi_model='virtio-scsi',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachSCSIVolumeTestJSON-284147988',owner_user_name='tempest-AttachSCSIVolumeTestJSON-284147988-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-07-27T09:30:47Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='103aa251c26c4987814bc5973d86e601',uuid=309c9c26-4a0f-45db-bb3a-595b19f3f627,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "626b287e-22ee-49d7-8ec3-c1a0660bd5d8", "address": "fa:16:3e:57:6c:c6", "network": {"id": "f6f1459d-22db-4bf7-8b7b-d6dec4fa18d5", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-2084493807-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cdd638e5400740279443a374e3e570d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap626b287e-22", "ovs_interfaceid": "626b287e-22ee-49d7-8ec3-c1a0660bd5d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Converting VIF {"id": "626b287e-22ee-49d7-8ec3-c1a0660bd5d8", "address": "fa:16:3e:57:6c:c6", "network": {"id": "f6f1459d-22db-4bf7-8b7b-d6dec4fa18d5", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-2084493807-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cdd638e5400740279443a374e3e570d4", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap626b287e-22", "ovs_interfaceid": "626b287e-22ee-49d7-8ec3-c1a0660bd5d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:6c:c6,bridge_name='br-int',has_traffic_filtering=True,id=626b287e-22ee-49d7-8ec3-c1a0660bd5d8,network=Network(f6f1459d-22db-4bf7-8b7b-d6dec4fa18d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap626b287e-22') {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG os_vif [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:6c:c6,bridge_name='br-int',has_traffic_filtering=True,id=626b287e-22ee-49d7-8ec3-c1a0660bd5d8,network=Network(f6f1459d-22db-4bf7-8b7b-d6dec4fa18d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap626b287e-22') {{(pid=70374) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap626b287e-22, may_exist=True, interface_attrs={}) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap626b287e-22, col_values=(('external_ids', {'iface-id': '626b287e-22ee-49d7-8ec3-c1a0660bd5d8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:57:6c:c6', 'vm-uuid': '309c9c26-4a0f-45db-bb3a-595b19f3f627'}),), if_exists=True) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) 
__log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:52 user nova-compute[70374]: INFO os_vif [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:6c:c6,bridge_name='br-int',has_traffic_filtering=True,id=626b287e-22ee-49d7-8ec3-c1a0660bd5d8,network=Network(f6f1459d-22db-4bf7-8b7b-d6dec4fa18d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap626b287e-22') Jul 27 09:30:52 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] No BDM found with device name sda, not building metadata. {{(pid=70374) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] No BDM found with device name sdb, not building metadata. {{(pid=70374) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] No VIF found with MAC fa:16:3e:57:6c:c6, not building metadata {{(pid=70374) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Jul 27 09:30:52 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Using config drive Jul 27 09:30:52 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Acquiring lock "b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Lock "b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG nova.compute.manager [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: 
b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Starting instance... {{(pid=70374) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:52 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=70374) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Jul 27 09:30:52 user nova-compute[70374]: INFO nova.compute.claims [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Claim successful on node user Jul 27 09:30:52 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Creating config drive at /opt/stack/data/nova/instances/309c9c26-4a0f-45db-bb3a-595b19f3f627/disk.config Jul 27 09:30:52 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Running cmd (subprocess): genisoimage -o /opt/stack/data/nova/instances/309c9c26-4a0f-45db-bb3a-595b19f3f627/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 0.0.0 -quiet -J -r -V config-2 /tmp/tmp_r1bz5oe {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:53 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] CMD "genisoimage -o /opt/stack/data/nova/instances/309c9c26-4a0f-45db-bb3a-595b19f3f627/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 0.0.0 -quiet -J -r -V config-2 /tmp/tmp_r1bz5oe" returned: 0 in 0.089s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:53 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:53 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:53 user 
nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:53 user nova-compute[70374]: DEBUG nova.network.neutron [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Successfully updated port: c8fbf870-e2d7-4ec6-be17-98f3d9315f75 {{(pid=70374) _update_port /opt/stack/nova/nova/network/neutron.py:586}} Jul 27 09:30:53 user nova-compute[70374]: DEBUG nova.network.neutron [req-6e768fa6-3f82-4722-9d89-996be36124de req-0d92771b-ac1a-4d26-b788-0b0501d97cee service nova] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Updated VIF entry in instance network info cache for port 626b287e-22ee-49d7-8ec3-c1a0660bd5d8. {{(pid=70374) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Jul 27 09:30:53 user nova-compute[70374]: DEBUG nova.network.neutron [req-6e768fa6-3f82-4722-9d89-996be36124de req-0d92771b-ac1a-4d26-b788-0b0501d97cee service nova] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Updating instance_info_cache with network_info: [{"id": "626b287e-22ee-49d7-8ec3-c1a0660bd5d8", "address": "fa:16:3e:57:6c:c6", "network": {"id": "f6f1459d-22db-4bf7-8b7b-d6dec4fa18d5", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-2084493807-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cdd638e5400740279443a374e3e570d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap626b287e-22", "ovs_interfaceid": "626b287e-22ee-49d7-8ec3-c1a0660bd5d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:30:53 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Acquiring lock "refresh_cache-773917c6-56d7-4491-a760-05f51593b7f0" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Jul 27 09:30:53 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Acquired lock "refresh_cache-773917c6-56d7-4491-a760-05f51593b7f0" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Jul 27 09:30:53 user nova-compute[70374]: DEBUG nova.network.neutron [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Building network info cache for instance {{(pid=70374) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} Jul 27 09:30:53 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils 
[req-6e768fa6-3f82-4722-9d89-996be36124de req-0d92771b-ac1a-4d26-b788-0b0501d97cee service nova] Releasing lock "refresh_cache-309c9c26-4a0f-45db-bb3a-595b19f3f627" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Jul 27 09:30:53 user nova-compute[70374]: DEBUG nova.compute.manager [req-96b561ef-f164-4582-ba93-f7c38e709268 req-3a319eb8-de57-480f-824a-32735803d19b service nova] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Received event network-changed-c8fbf870-e2d7-4ec6-be17-98f3d9315f75 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:30:53 user nova-compute[70374]: DEBUG nova.compute.manager [req-96b561ef-f164-4582-ba93-f7c38e709268 req-3a319eb8-de57-480f-824a-32735803d19b service nova] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Refreshing instance network info cache due to event network-changed-c8fbf870-e2d7-4ec6-be17-98f3d9315f75. {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Jul 27 09:30:53 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-96b561ef-f164-4582-ba93-f7c38e709268 req-3a319eb8-de57-480f-824a-32735803d19b service nova] Acquiring lock "refresh_cache-773917c6-56d7-4491-a760-05f51593b7f0" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Jul 27 09:30:53 user nova-compute[70374]: DEBUG nova.network.neutron [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Instance cache missing network info. {{(pid=70374) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Jul 27 09:30:53 user nova-compute[70374]: DEBUG nova.compute.provider_tree [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Inventory has not changed in ProviderTree for provider: f7548644-4a09-4ad8-9aa6-6e05d85a9f5b {{(pid=70374) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Jul 27 09:30:53 user nova-compute[70374]: DEBUG nova.scheduler.client.report [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Inventory has not changed for provider f7548644-4a09-4ad8-9aa6-6e05d85a9f5b based on inventory data: {'MEMORY_MB': {'total': 16011, 'reserved': 512, 'min_unit': 1, 'max_unit': 16011, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70374) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Jul 27 09:30:53 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.936s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:53 user nova-compute[70374]: DEBUG nova.compute.manager [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] 
[instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Start building networks asynchronously for instance. {{(pid=70374) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} Jul 27 09:30:53 user nova-compute[70374]: DEBUG nova.compute.manager [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Allocating IP information in the background. {{(pid=70374) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} Jul 27 09:30:53 user nova-compute[70374]: DEBUG nova.network.neutron [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] allocate_for_instance() {{(pid=70374) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} Jul 27 09:30:53 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Jul 27 09:30:53 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:53 user nova-compute[70374]: DEBUG nova.compute.manager [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Start building block device mappings for instance. {{(pid=70374) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} Jul 27 09:30:53 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:53 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG nova.compute.manager [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Start spawning the instance on the hypervisor. 
{{(pid=70374) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Creating instance directory {{(pid=70374) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Jul 27 09:30:54 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Creating image(s) Jul 27 09:30:54 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Acquiring lock "/opt/stack/data/nova/instances/b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Lock "/opt/stack/data/nova/instances/b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Lock "/opt/stack/data/nova/instances/b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json" returned: 0 in 0.212s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 
tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Acquiring lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.003s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG nova.policy [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6d33f8cd041046c18af25f56b63b6bb5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'df3e52a41c1847b199e6dcd09b676fba', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70374) authorize /opt/stack/nova/nova/policy.py:203}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json" returned: 0 in 0.155s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1,backing_fmt=raw /opt/stack/data/nova/instances/b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06/disk 1073741824 {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o 
backing_file=/opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1,backing_fmt=raw /opt/stack/data/nova/instances/b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06/disk 1073741824" returned: 0 in 0.059s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.224s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG nova.network.neutron [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Updating instance_info_cache with network_info: [{"id": "c8fbf870-e2d7-4ec6-be17-98f3d9315f75", "address": "fa:16:3e:19:0c:8d", "network": {"id": "efa0e322-091b-4b62-b5f9-8a09fa192696", "bridge": "br-int", "label": "tempest-VolumesActionsTest-681513154-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d88292245b8d4bbfa07efc8084ae089c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8fbf870-e2", "ovs_interfaceid": "c8fbf870-e2d7-4ec6-be17-98f3d9315f75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Releasing lock "refresh_cache-773917c6-56d7-4491-a760-05f51593b7f0" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG nova.compute.manager [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Instance network_info: |[{"id": "c8fbf870-e2d7-4ec6-be17-98f3d9315f75", "address": "fa:16:3e:19:0c:8d", "network": {"id": "efa0e322-091b-4b62-b5f9-8a09fa192696", "bridge": "br-int", 
"label": "tempest-VolumesActionsTest-681513154-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d88292245b8d4bbfa07efc8084ae089c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8fbf870-e2", "ovs_interfaceid": "c8fbf870-e2d7-4ec6-be17-98f3d9315f75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70374) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-96b561ef-f164-4582-ba93-f7c38e709268 req-3a319eb8-de57-480f-824a-32735803d19b service nova] Acquired lock "refresh_cache-773917c6-56d7-4491-a760-05f51593b7f0" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG nova.network.neutron [req-96b561ef-f164-4582-ba93-f7c38e709268 req-3a319eb8-de57-480f-824a-32735803d19b service nova] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Refreshing network info cache for port c8fbf870-e2d7-4ec6-be17-98f3d9315f75 {{(pid=70374) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Start _get_guest_xml network_info=[{"id": "c8fbf870-e2d7-4ec6-be17-98f3d9315f75", "address": "fa:16:3e:19:0c:8d", "network": {"id": "efa0e322-091b-4b62-b5f9-8a09fa192696", "bridge": "br-int", "label": "tempest-VolumesActionsTest-681513154-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d88292245b8d4bbfa07efc8084ae089c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8fbf870-e2", "ovs_interfaceid": "c8fbf870-e2d7-4ec6-be17-98f3d9315f75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-07-27T09:26:27Z,direct_url=,disk_format='qcow2',id=35458adf-261a-4e0b-a4db-b243619b2394,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='37aff788807d4b2aafc10d355583f7c7',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-07-27T09:26:29Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encrypted': False, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'size': 0, 'encryption_options': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'image_id': '35458adf-261a-4e0b-a4db-b243619b2394'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70374) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7553}} Jul 27 09:30:54 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Jul 27 09:30:54 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Jul 27 09:30:54 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Searching host: 'user' for CPU controller through CGroups V1... {{(pid=70374) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1650}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] CPU controller missing on host. {{(pid=70374) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1660}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Searching host: 'user' for CPU controller through CGroups V2... {{(pid=70374) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1669}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] CPU controller found on host. 
{{(pid=70374) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1676}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70374) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-07-27T09:26:16Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1',id=6,is_public=True,memory_mb=512,name='m1.tiny',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-07-27T09:26:27Z,direct_url=,disk_format='qcow2',id=35458adf-261a-4e0b-a4db-b243619b2394,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='37aff788807d4b2aafc10d355583f7c7',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-07-27T09:26:29Z,virtual_size=,visibility=), allow threads: True {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Flavor limits 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Image limits 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Flavor pref 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Image pref 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70374) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:564}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70374) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Got 1 possible topologies {{(pid=70374) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG nova.virt.libvirt.vif [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-07-27T09:30:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-1574025589',display_name='tempest-VolumesActionsTest-instance-1574025589',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(6),hidden=False,host='user',hostname='tempest-volumesactionstest-instance-1574025589',id=10,image_ref='35458adf-261a-4e0b-a4db-b243619b2394',info_cache=InstanceInfoCache,instance_type_id=6,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d88292245b8d4bbfa07efc8084ae089c',ramdisk_id='',reservation_id='r-dbzwfifo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='35458adf-261a-4e0b-a4db-b243619b2394',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesActionsTest-2082395054',owner_user_name='tempest-VolumesActionsTest-2082395054-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=
None,updated_at=2023-07-27T09:30:50Z,user_data=None,user_id='459cb24159604669a118a1da67fbdf72',uuid=773917c6-56d7-4491-a760-05f51593b7f0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c8fbf870-e2d7-4ec6-be17-98f3d9315f75", "address": "fa:16:3e:19:0c:8d", "network": {"id": "efa0e322-091b-4b62-b5f9-8a09fa192696", "bridge": "br-int", "label": "tempest-VolumesActionsTest-681513154-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d88292245b8d4bbfa07efc8084ae089c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8fbf870-e2", "ovs_interfaceid": "c8fbf870-e2d7-4ec6-be17-98f3d9315f75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70374) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Converting VIF {"id": "c8fbf870-e2d7-4ec6-be17-98f3d9315f75", "address": "fa:16:3e:19:0c:8d", "network": {"id": "efa0e322-091b-4b62-b5f9-8a09fa192696", "bridge": "br-int", "label": "tempest-VolumesActionsTest-681513154-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d88292245b8d4bbfa07efc8084ae089c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8fbf870-e2", "ovs_interfaceid": "c8fbf870-e2d7-4ec6-be17-98f3d9315f75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:0c:8d,bridge_name='br-int',has_traffic_filtering=True,id=c8fbf870-e2d7-4ec6-be17-98f3d9315f75,network=Network(efa0e322-091b-4b62-b5f9-8a09fa192696),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8fbf870-e2') {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG nova.objects.instance [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Lazy-loading 'pci_devices' on Instance uuid 773917c6-56d7-4491-a760-05f51593b7f0 {{(pid=70374) obj_load_attr 
/opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json" returned: 0 in 0.193s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG nova.virt.disk.api [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Checking if we can resize image /opt/stack/data/nova/instances/b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06/disk. size=1073741824 {{(pid=70374) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] End _get_guest_xml xml= 
[the libvirt guest XML follows on the original journal continuation lines, but the capture stripped its markup; the surviving text fragments give: uuid 773917c6-56d7-4491-a760-05f51593b7f0, domain name instance-0000000a, memory 524288 KiB, 1 vCPU, Nova metadata name tempest-VolumesActionsTest-instance-1574025589 created 2023-07-27 09:30:54, flavor values 512 MB memory / 1 GB root disk / 0 swap / 0 ephemeral / 1 vCPU, owner tempest-VolumesActionsTest-2082395054-project-member in project tempest-VolumesActionsTest-2082395054, sysinfo OpenStack Foundation / OpenStack Nova / 0.0.0, os type hvm, CPU model Nehalem, RNG backend /dev/urandom] 
{{(pid=70374) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7559}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG nova.virt.libvirt.vif [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] vif_type=ovs
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-07-27T09:30:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-1574025589',display_name='tempest-VolumesActionsTest-instance-1574025589',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(6),hidden=False,host='user',hostname='tempest-volumesactionstest-instance-1574025589',id=10,image_ref='35458adf-261a-4e0b-a4db-b243619b2394',info_cache=InstanceInfoCache,instance_type_id=6,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d88292245b8d4bbfa07efc8084ae089c',ramdisk_id='',reservation_id='r-dbzwfifo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='35458adf-261a-4e0b-a4db-b243619b2394',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesActionsTest-2082395054',owner_user_name='tempest-VolumesActionsTest-2082395054-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-07-27T09:30:50Z,user_data=None,user_id='459cb24159604669a118a1da67fbdf72',uuid=773917c6-56d7-4491-a760-05f51593b7f0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c8fbf870-e2d7-4ec6-be17-98f3d9315f75", "address": "fa:16:3e:19:0c:8d", "network": {"id": "efa0e322-091b-4b62-b5f9-8a09fa192696", "bridge": "br-int", "label": "tempest-VolumesActionsTest-681513154-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d88292245b8d4bbfa07efc8084ae089c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8fbf870-e2", "ovs_interfaceid": "c8fbf870-e2d7-4ec6-be17-98f3d9315f75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Converting VIF {"id": "c8fbf870-e2d7-4ec6-be17-98f3d9315f75", "address": "fa:16:3e:19:0c:8d", "network": {"id": "efa0e322-091b-4b62-b5f9-8a09fa192696", 
"bridge": "br-int", "label": "tempest-VolumesActionsTest-681513154-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d88292245b8d4bbfa07efc8084ae089c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8fbf870-e2", "ovs_interfaceid": "c8fbf870-e2d7-4ec6-be17-98f3d9315f75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:0c:8d,bridge_name='br-int',has_traffic_filtering=True,id=c8fbf870-e2d7-4ec6-be17-98f3d9315f75,network=Network(efa0e322-091b-4b62-b5f9-8a09fa192696),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8fbf870-e2') {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG os_vif [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:0c:8d,bridge_name='br-int',has_traffic_filtering=True,id=c8fbf870-e2d7-4ec6-be17-98f3d9315f75,network=Network(efa0e322-091b-4b62-b5f9-8a09fa192696),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8fbf870-e2') {{(pid=70374) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc8fbf870-e2, may_exist=True, interface_attrs={}) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, 
record=tapc8fbf870-e2, col_values=(('external_ids', {'iface-id': 'c8fbf870-e2d7-4ec6-be17-98f3d9315f75', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:19:0c:8d', 'vm-uuid': '773917c6-56d7-4491-a760-05f51593b7f0'}),), if_exists=True) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:54 user nova-compute[70374]: INFO os_vif [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:0c:8d,bridge_name='br-int',has_traffic_filtering=True,id=c8fbf870-e2d7-4ec6-be17-98f3d9315f75,network=Network(efa0e322-091b-4b62-b5f9-8a09fa192696),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8fbf870-e2') Jul 27 09:30:54 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06/disk --force-share --output=json" returned: 0 in 0.190s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG nova.virt.disk.api [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Cannot resize image /opt/stack/data/nova/instances/b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06/disk to a smaller size. 
{{(pid=70374) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG nova.objects.instance [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Lazy-loading 'migration_context' on Instance uuid b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Created local disks {{(pid=70374) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4851}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Ensure instance console log exists: /opt/stack/data/nova/instances/b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06/console.log {{(pid=70374) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4603}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:54 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:55 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:55 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=70374) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Jul 27 09:30:55 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] No VIF found with MAC fa:16:3e:19:0c:8d, not building metadata {{(pid=70374) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Jul 27 09:30:55 user nova-compute[70374]: DEBUG nova.compute.manager [req-41b5b2c7-2041-4659-9ed7-2b00bd63d498 req-4d82643f-38f2-4ac0-9806-d1ebbd1f85d3 service nova] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Received event network-vif-plugged-626b287e-22ee-49d7-8ec3-c1a0660bd5d8 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:30:55 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-41b5b2c7-2041-4659-9ed7-2b00bd63d498 req-4d82643f-38f2-4ac0-9806-d1ebbd1f85d3 service nova] Acquiring lock "309c9c26-4a0f-45db-bb3a-595b19f3f627-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:55 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-41b5b2c7-2041-4659-9ed7-2b00bd63d498 req-4d82643f-38f2-4ac0-9806-d1ebbd1f85d3 service nova] Lock "309c9c26-4a0f-45db-bb3a-595b19f3f627-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:55 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-41b5b2c7-2041-4659-9ed7-2b00bd63d498 req-4d82643f-38f2-4ac0-9806-d1ebbd1f85d3 service nova] Lock "309c9c26-4a0f-45db-bb3a-595b19f3f627-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:55 user nova-compute[70374]: DEBUG nova.compute.manager [req-41b5b2c7-2041-4659-9ed7-2b00bd63d498 req-4d82643f-38f2-4ac0-9806-d1ebbd1f85d3 service nova] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] No waiting events found dispatching network-vif-plugged-626b287e-22ee-49d7-8ec3-c1a0660bd5d8 {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:30:55 user nova-compute[70374]: WARNING nova.compute.manager [req-41b5b2c7-2041-4659-9ed7-2b00bd63d498 req-4d82643f-38f2-4ac0-9806-d1ebbd1f85d3 service nova] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Received unexpected event network-vif-plugged-626b287e-22ee-49d7-8ec3-c1a0660bd5d8 for instance with vm_state building and task_state spawning. Jul 27 09:30:55 user nova-compute[70374]: DEBUG nova.network.neutron [req-96b561ef-f164-4582-ba93-f7c38e709268 req-3a319eb8-de57-480f-824a-32735803d19b service nova] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Updated VIF entry in instance network info cache for port c8fbf870-e2d7-4ec6-be17-98f3d9315f75. 
{{(pid=70374) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Jul 27 09:30:55 user nova-compute[70374]: DEBUG nova.network.neutron [req-96b561ef-f164-4582-ba93-f7c38e709268 req-3a319eb8-de57-480f-824a-32735803d19b service nova] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Updating instance_info_cache with network_info: [{"id": "c8fbf870-e2d7-4ec6-be17-98f3d9315f75", "address": "fa:16:3e:19:0c:8d", "network": {"id": "efa0e322-091b-4b62-b5f9-8a09fa192696", "bridge": "br-int", "label": "tempest-VolumesActionsTest-681513154-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d88292245b8d4bbfa07efc8084ae089c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8fbf870-e2", "ovs_interfaceid": "c8fbf870-e2d7-4ec6-be17-98f3d9315f75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:30:55 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-96b561ef-f164-4582-ba93-f7c38e709268 req-3a319eb8-de57-480f-824a-32735803d19b service nova] Releasing lock "refresh_cache-773917c6-56d7-4491-a760-05f51593b7f0" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Jul 27 09:30:55 user nova-compute[70374]: DEBUG nova.compute.manager [req-d875d580-3fda-4c0d-8e28-6fa705d1f9f7 req-eb2d8c50-9709-4663-880d-f4701093cce1 service nova] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Received event network-vif-plugged-018c2bfb-405c-4bba-a54f-e8ea773e57bf {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:30:55 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-d875d580-3fda-4c0d-8e28-6fa705d1f9f7 req-eb2d8c50-9709-4663-880d-f4701093cce1 service nova] Acquiring lock "d35fe056-8279-479a-a673-6c61e5ec6933-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:55 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-d875d580-3fda-4c0d-8e28-6fa705d1f9f7 req-eb2d8c50-9709-4663-880d-f4701093cce1 service nova] Lock "d35fe056-8279-479a-a673-6c61e5ec6933-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:55 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-d875d580-3fda-4c0d-8e28-6fa705d1f9f7 req-eb2d8c50-9709-4663-880d-f4701093cce1 service nova] Lock "d35fe056-8279-479a-a673-6c61e5ec6933-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:55 user nova-compute[70374]: DEBUG nova.compute.manager [req-d875d580-3fda-4c0d-8e28-6fa705d1f9f7 req-eb2d8c50-9709-4663-880d-f4701093cce1 service nova] 
[instance: d35fe056-8279-479a-a673-6c61e5ec6933] No waiting events found dispatching network-vif-plugged-018c2bfb-405c-4bba-a54f-e8ea773e57bf {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:30:55 user nova-compute[70374]: WARNING nova.compute.manager [req-d875d580-3fda-4c0d-8e28-6fa705d1f9f7 req-eb2d8c50-9709-4663-880d-f4701093cce1 service nova] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Received unexpected event network-vif-plugged-018c2bfb-405c-4bba-a54f-e8ea773e57bf for instance with vm_state building and task_state spawning. Jul 27 09:30:55 user nova-compute[70374]: DEBUG nova.compute.manager [req-d875d580-3fda-4c0d-8e28-6fa705d1f9f7 req-eb2d8c50-9709-4663-880d-f4701093cce1 service nova] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Received event network-vif-plugged-018c2bfb-405c-4bba-a54f-e8ea773e57bf {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:30:55 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-d875d580-3fda-4c0d-8e28-6fa705d1f9f7 req-eb2d8c50-9709-4663-880d-f4701093cce1 service nova] Acquiring lock "d35fe056-8279-479a-a673-6c61e5ec6933-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:55 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-d875d580-3fda-4c0d-8e28-6fa705d1f9f7 req-eb2d8c50-9709-4663-880d-f4701093cce1 service nova] Lock "d35fe056-8279-479a-a673-6c61e5ec6933-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:55 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-d875d580-3fda-4c0d-8e28-6fa705d1f9f7 req-eb2d8c50-9709-4663-880d-f4701093cce1 service nova] Lock "d35fe056-8279-479a-a673-6c61e5ec6933-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:55 user nova-compute[70374]: DEBUG nova.compute.manager [req-d875d580-3fda-4c0d-8e28-6fa705d1f9f7 req-eb2d8c50-9709-4663-880d-f4701093cce1 service nova] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] No waiting events found dispatching network-vif-plugged-018c2bfb-405c-4bba-a54f-e8ea773e57bf {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:30:55 user nova-compute[70374]: WARNING nova.compute.manager [req-d875d580-3fda-4c0d-8e28-6fa705d1f9f7 req-eb2d8c50-9709-4663-880d-f4701093cce1 service nova] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Received unexpected event network-vif-plugged-018c2bfb-405c-4bba-a54f-e8ea773e57bf for instance with vm_state building and task_state spawning. 
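The network-vif-plugged events handled above reach nova-compute through Nova's os-server-external-events REST API; Neutron calls it once it has finished wiring a port, and the "Received unexpected event ... vm_state building and task_state spawning" warnings simply mean the event arrived before the compute manager had registered a waiter for it (hence "No waiting events found dispatching ..."). The sketch below shows the shape of that call using only the standard library; the endpoint URL and token are placeholders rather than values from this log, since Neutron normally issues the request itself.

    # Sketch of the os-server-external-events call behind the
    # "Received event network-vif-plugged-<port-id>" entries above.
    # NOVA_URL and TOKEN are assumptions, not taken from this log.
    import json
    import urllib.request

    NOVA_URL = "http://controller:8774/v2.1"   # assumed nova-api endpoint
    TOKEN = "<keystone-token>"                 # assumed valid service token

    def send_vif_plugged(server_uuid: str, port_id: str) -> None:
        """Tell Nova that the VIF for a Neutron port has been plugged."""
        body = {"events": [{
            "name": "network-vif-plugged",
            "server_uuid": server_uuid,   # instance UUID, as in the log
            "tag": port_id,               # Neutron port UUID, as in the log
            "status": "completed",
        }]}
        req = urllib.request.Request(
            f"{NOVA_URL}/os-server-external-events",
            data=json.dumps(body).encode(),
            headers={"Content-Type": "application/json",
                     "X-Auth-Token": TOKEN},
            method="POST",
        )
        with urllib.request.urlopen(req) as resp:
            print(resp.status, resp.read().decode())

    # e.g. the event dispatched above for the TestVolumeSwap instance:
    # send_vif_plugged("d35fe056-8279-479a-a673-6c61e5ec6933",
    #                  "018c2bfb-405c-4bba-a54f-e8ea773e57bf")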
Jul 27 09:30:55 user nova-compute[70374]: DEBUG nova.network.neutron [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Successfully created port: 6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f {{(pid=70374) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} Jul 27 09:30:55 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:55 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:55 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:55 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:56 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:56 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:56 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.compute.manager [req-3a7e2013-dede-4b02-97c7-7e03567bee72 req-3fc03cec-2d90-4f3a-81c8-d2174b22afaf service nova] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Received event network-vif-plugged-c8fbf870-e2d7-4ec6-be17-98f3d9315f75 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-3a7e2013-dede-4b02-97c7-7e03567bee72 req-3fc03cec-2d90-4f3a-81c8-d2174b22afaf service nova] Acquiring lock "773917c6-56d7-4491-a760-05f51593b7f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-3a7e2013-dede-4b02-97c7-7e03567bee72 req-3fc03cec-2d90-4f3a-81c8-d2174b22afaf service nova] Lock "773917c6-56d7-4491-a760-05f51593b7f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-3a7e2013-dede-4b02-97c7-7e03567bee72 req-3fc03cec-2d90-4f3a-81c8-d2174b22afaf service nova] Lock "773917c6-56d7-4491-a760-05f51593b7f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.compute.manager [req-3a7e2013-dede-4b02-97c7-7e03567bee72 req-3fc03cec-2d90-4f3a-81c8-d2174b22afaf service nova] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] No waiting 
events found dispatching network-vif-plugged-c8fbf870-e2d7-4ec6-be17-98f3d9315f75 {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:30:57 user nova-compute[70374]: WARNING nova.compute.manager [req-3a7e2013-dede-4b02-97c7-7e03567bee72 req-3fc03cec-2d90-4f3a-81c8-d2174b22afaf service nova] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Received unexpected event network-vif-plugged-c8fbf870-e2d7-4ec6-be17-98f3d9315f75 for instance with vm_state building and task_state spawning. Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.compute.manager [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Instance event wait completed in 0 seconds for {{(pid=70374) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Guest created on hypervisor {{(pid=70374) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4411}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.virt.driver [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Emitting event Resumed> {{(pid=70374) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Jul 27 09:30:57 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] VM Resumed (Lifecycle Event) Jul 27 09:30:57 user nova-compute[70374]: INFO nova.virt.libvirt.driver [-] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Instance spawned successfully. 
Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:894}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.network.neutron [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Successfully updated port: 6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f {{(pid=70374) _update_port /opt/stack/nova/nova/network/neutron.py:586}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70374) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Acquiring lock "refresh_cache-b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Acquired lock "refresh_cache-b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.network.neutron [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Building network info cache for instance {{(pid=70374) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Found default for hw_cdrom_bus of ide {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Found default for hw_disk_bus of virtio {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 
09:30:57 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Found default for hw_input_bus of None {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Found default for hw_pointer_model of None {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Found default for hw_video_model of virtio {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Found default for hw_vif_model of virtio {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:57 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] During sync_power_state the instance has a pending task (spawning). Skip. Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.virt.driver [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Emitting event Started> {{(pid=70374) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Jul 27 09:30:57 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] VM Started (Lifecycle Event) Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70374) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.network.neutron [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Instance cache missing network info. 
{{(pid=70374) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Jul 27 09:30:57 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] During sync_power_state the instance has a pending task (spawning). Skip. Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.compute.manager [req-3084a275-9b54-4411-a6a8-135b42edd3be req-0a22029e-5f10-49f4-9675-231112e96867 service nova] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Received event network-vif-plugged-626b287e-22ee-49d7-8ec3-c1a0660bd5d8 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-3084a275-9b54-4411-a6a8-135b42edd3be req-0a22029e-5f10-49f4-9675-231112e96867 service nova] Acquiring lock "309c9c26-4a0f-45db-bb3a-595b19f3f627-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-3084a275-9b54-4411-a6a8-135b42edd3be req-0a22029e-5f10-49f4-9675-231112e96867 service nova] Lock "309c9c26-4a0f-45db-bb3a-595b19f3f627-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-3084a275-9b54-4411-a6a8-135b42edd3be req-0a22029e-5f10-49f4-9675-231112e96867 service nova] Lock "309c9c26-4a0f-45db-bb3a-595b19f3f627-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.compute.manager [req-3084a275-9b54-4411-a6a8-135b42edd3be req-0a22029e-5f10-49f4-9675-231112e96867 service nova] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] No waiting events found dispatching network-vif-plugged-626b287e-22ee-49d7-8ec3-c1a0660bd5d8 {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:30:57 user nova-compute[70374]: WARNING nova.compute.manager [req-3084a275-9b54-4411-a6a8-135b42edd3be req-0a22029e-5f10-49f4-9675-231112e96867 service nova] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Received unexpected event network-vif-plugged-626b287e-22ee-49d7-8ec3-c1a0660bd5d8 for instance with vm_state building and task_state spawning. Jul 27 09:30:57 user nova-compute[70374]: INFO nova.compute.manager [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Took 8.81 seconds to spawn the instance on the hypervisor. 
Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.compute.manager [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:30:57 user nova-compute[70374]: INFO nova.compute.manager [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Took 11.27 seconds to build instance. Jul 27 09:30:57 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-8efbf7c5-01d1-45b7-8dfb-fe367e033528 tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Lock "d35fe056-8279-479a-a673-6c61e5ec6933" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 11.390s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.compute.manager [req-3f9093eb-7a98-45d4-9b54-e83a92284eca req-7c93341f-731d-4fb1-ab3f-2a3396c8f2ad service nova] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Received event network-changed-6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.compute.manager [req-3f9093eb-7a98-45d4-9b54-e83a92284eca req-7c93341f-731d-4fb1-ab3f-2a3396c8f2ad service nova] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Refreshing instance network info cache due to event network-changed-6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f. 
{{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-3f9093eb-7a98-45d4-9b54-e83a92284eca req-7c93341f-731d-4fb1-ab3f-2a3396c8f2ad service nova] Acquiring lock "refresh_cache-b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.network.neutron [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Updating instance_info_cache with network_info: [{"id": "6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f", "address": "fa:16:3e:b2:9d:c3", "network": {"id": "158fe9d2-5b60-4b57-bdcb-10de0604f194", "bridge": "br-int", "label": "tempest-TaggedAttachmentsTest-1646104715-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "df3e52a41c1847b199e6dcd09b676fba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6bd9ecfd-f7", "ovs_interfaceid": "6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Releasing lock "refresh_cache-b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.compute.manager [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Instance network_info: |[{"id": "6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f", "address": "fa:16:3e:b2:9d:c3", "network": {"id": "158fe9d2-5b60-4b57-bdcb-10de0604f194", "bridge": "br-int", "label": "tempest-TaggedAttachmentsTest-1646104715-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "df3e52a41c1847b199e6dcd09b676fba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6bd9ecfd-f7", "ovs_interfaceid": "6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70374) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-3f9093eb-7a98-45d4-9b54-e83a92284eca req-7c93341f-731d-4fb1-ab3f-2a3396c8f2ad service nova] Acquired lock "refresh_cache-b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.network.neutron [req-3f9093eb-7a98-45d4-9b54-e83a92284eca req-7c93341f-731d-4fb1-ab3f-2a3396c8f2ad service nova] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Refreshing network info cache for port 6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f {{(pid=70374) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Start _get_guest_xml network_info=[{"id": "6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f", "address": "fa:16:3e:b2:9d:c3", "network": {"id": "158fe9d2-5b60-4b57-bdcb-10de0604f194", "bridge": "br-int", "label": "tempest-TaggedAttachmentsTest-1646104715-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "df3e52a41c1847b199e6dcd09b676fba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6bd9ecfd-f7", "ovs_interfaceid": "6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'ide', 'dev': 'hda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-07-27T09:26:27Z,direct_url=,disk_format='qcow2',id=35458adf-261a-4e0b-a4db-b243619b2394,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='37aff788807d4b2aafc10d355583f7c7',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-07-27T09:26:29Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encrypted': False, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'size': 0, 'encryption_options': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'image_id': '35458adf-261a-4e0b-a4db-b243619b2394'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70374) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7553}} Jul 27 09:30:57 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Jul 27 09:30:57 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Searching host: 'user' for CPU controller through CGroups V1... {{(pid=70374) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1650}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] CPU controller missing on host. {{(pid=70374) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1660}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Searching host: 'user' for CPU controller through CGroups V2... {{(pid=70374) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1669}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] CPU controller found on host. {{(pid=70374) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1676}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70374) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-07-27T09:26:16Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1',id=6,is_public=True,memory_mb=512,name='m1.tiny',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-07-27T09:26:27Z,direct_url=,disk_format='qcow2',id=35458adf-261a-4e0b-a4db-b243619b2394,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='37aff788807d4b2aafc10d355583f7c7',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-07-27T09:26:29Z,virtual_size=,visibility=), allow threads: True {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Flavor limits 0:0:0 {{(pid=70374) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:343}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Image limits 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Flavor pref 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Image pref 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70374) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Got 1 possible topologies {{(pid=70374) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.virt.libvirt.vif [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2023-07-27T09:30:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-175106609',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(6),hidden=False,host='user',hostname='tempest-device-tagging-server-175106609',id=11,image_ref='35458adf-261a-4e0b-a4db-b243619b2394',info_cache=InstanceInfoCache,instance_type_id=6,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOTnqzjorB039ilJl8VK7YcSm6BfDbMAMeMlN6KuNuE2BR5Ci7E8V9F4eHRzTVNR5ErMNNWpJQddk0yLJsVN++T9e5hUTfZ9niEsaZXI1P72KyNTxQBI7EWBQg10Ylojmg==',key_name='tempest-keypair-693359237',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='df3e52a41c1847b199e6dcd09b676fba',ramdisk_id='',reservation_id='r-tordp007',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='35458adf-261a-4e0b-a4db-b243619b2394',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-TaggedAttachmentsTest-493274998',owner_user_name='tempest-TaggedAttachmentsTest-493274998-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-07-27T09:30:54Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='6d33f8cd041046c18af25f56b63b6bb5',uuid=b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f", "address": "fa:16:3e:b2:9d:c3", "network": {"id": "158fe9d2-5b60-4b57-bdcb-10de0604f194", "bridge": "br-int", "label": "tempest-TaggedAttachmentsTest-1646104715-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "df3e52a41c1847b199e6dcd09b676fba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6bd9ecfd-f7", "ovs_interfaceid": "6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70374) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG 
nova.network.os_vif_util [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Converting VIF {"id": "6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f", "address": "fa:16:3e:b2:9d:c3", "network": {"id": "158fe9d2-5b60-4b57-bdcb-10de0604f194", "bridge": "br-int", "label": "tempest-TaggedAttachmentsTest-1646104715-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "df3e52a41c1847b199e6dcd09b676fba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6bd9ecfd-f7", "ovs_interfaceid": "6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b2:9d:c3,bridge_name='br-int',has_traffic_filtering=True,id=6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f,network=Network(158fe9d2-5b60-4b57-bdcb-10de0604f194),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6bd9ecfd-f7') {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.objects.instance [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Lazy-loading 'pci_devices' on Instance uuid b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] End _get_guest_xml xml= [libvirt guest XML omitted: the multi-line domain XML was captured with its markup stripped, leaving only bare text nodes on continuation lines timestamped Jul 27 09:30:57; recoverable values include uuid b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06, name instance-0000000b, nova:name tempest-device-tagging-server-175106609, creationTime 2023-07-27 09:30:57, memory 524288 KiB (flavor 512 MB, 1 vCPU), project tempest-TaggedAttachmentsTest-493274998, user tempest-TaggedAttachmentsTest-493274998-project-member, sysinfo "OpenStack Foundation" / "OpenStack Nova" / 0.0.0 / "Virtual Machine", os type hvm, CPU model Nehalem, RNG backend /dev/urandom] {{(pid=70374) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7559}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.virt.libvirt.vif [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] vif_type=ovs
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2023-07-27T09:30:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-175106609',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(6),hidden=False,host='user',hostname='tempest-device-tagging-server-175106609',id=11,image_ref='35458adf-261a-4e0b-a4db-b243619b2394',info_cache=InstanceInfoCache,instance_type_id=6,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOTnqzjorB039ilJl8VK7YcSm6BfDbMAMeMlN6KuNuE2BR5Ci7E8V9F4eHRzTVNR5ErMNNWpJQddk0yLJsVN++T9e5hUTfZ9niEsaZXI1P72KyNTxQBI7EWBQg10Ylojmg==',key_name='tempest-keypair-693359237',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='df3e52a41c1847b199e6dcd09b676fba',ramdisk_id='',reservation_id='r-tordp007',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='35458adf-261a-4e0b-a4db-b243619b2394',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-TaggedAttachmentsTest-493274998',owner_user_name='tempest-TaggedAttachmentsTest-493274998-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-07-27T09:30:54Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='6d33f8cd041046c18af25f56b63b6bb5',uuid=b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f", "address": "fa:16:3e:b2:9d:c3", "network": {"id": "158fe9d2-5b60-4b57-bdcb-10de0604f194", "bridge": "br-int", "label": "tempest-TaggedAttachmentsTest-1646104715-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "df3e52a41c1847b199e6dcd09b676fba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6bd9ecfd-f7", "ovs_interfaceid": "6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG 
nova.network.os_vif_util [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Converting VIF {"id": "6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f", "address": "fa:16:3e:b2:9d:c3", "network": {"id": "158fe9d2-5b60-4b57-bdcb-10de0604f194", "bridge": "br-int", "label": "tempest-TaggedAttachmentsTest-1646104715-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "df3e52a41c1847b199e6dcd09b676fba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6bd9ecfd-f7", "ovs_interfaceid": "6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b2:9d:c3,bridge_name='br-int',has_traffic_filtering=True,id=6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f,network=Network(158fe9d2-5b60-4b57-bdcb-10de0604f194),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6bd9ecfd-f7') {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG os_vif [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b2:9d:c3,bridge_name='br-int',has_traffic_filtering=True,id=6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f,network=Network(158fe9d2-5b60-4b57-bdcb-10de0604f194),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6bd9ecfd-f7') {{(pid=70374) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, 
port=tap6bd9ecfd-f7, may_exist=True, interface_attrs={}) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6bd9ecfd-f7, col_values=(('external_ids', {'iface-id': '6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b2:9d:c3', 'vm-uuid': 'b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06'}),), if_exists=True) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Jul 27 09:30:57 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:57 user nova-compute[70374]: INFO os_vif [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b2:9d:c3,bridge_name='br-int',has_traffic_filtering=True,id=6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f,network=Network(158fe9d2-5b60-4b57-bdcb-10de0604f194),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6bd9ecfd-f7') Jul 27 09:30:58 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] No BDM found with device name vda, not building metadata. {{(pid=70374) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Jul 27 09:30:58 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] No BDM found with device name hda, not building metadata. 
{{(pid=70374) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Jul 27 09:30:58 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] No VIF found with MAC fa:16:3e:b2:9d:c3, not building metadata {{(pid=70374) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Jul 27 09:30:58 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Using config drive Jul 27 09:30:58 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Creating config drive at /opt/stack/data/nova/instances/b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06/disk.config Jul 27 09:30:58 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Running cmd (subprocess): genisoimage -o /opt/stack/data/nova/instances/b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 0.0.0 -quiet -J -r -V config-2 /tmp/tmpz_wytjg6 {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:30:58 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] CMD "genisoimage -o /opt/stack/data/nova/instances/b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 0.0.0 -quiet -J -r -V config-2 /tmp/tmpz_wytjg6" returned: 0 in 0.084s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:30:58 user nova-compute[70374]: DEBUG nova.network.neutron [req-3f9093eb-7a98-45d4-9b54-e83a92284eca req-7c93341f-731d-4fb1-ab3f-2a3396c8f2ad service nova] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Updated VIF entry in instance network info cache for port 6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f. 
{{(pid=70374) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Jul 27 09:30:58 user nova-compute[70374]: DEBUG nova.network.neutron [req-3f9093eb-7a98-45d4-9b54-e83a92284eca req-7c93341f-731d-4fb1-ab3f-2a3396c8f2ad service nova] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Updating instance_info_cache with network_info: [{"id": "6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f", "address": "fa:16:3e:b2:9d:c3", "network": {"id": "158fe9d2-5b60-4b57-bdcb-10de0604f194", "bridge": "br-int", "label": "tempest-TaggedAttachmentsTest-1646104715-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "df3e52a41c1847b199e6dcd09b676fba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6bd9ecfd-f7", "ovs_interfaceid": "6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:30:58 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-3f9093eb-7a98-45d4-9b54-e83a92284eca req-7c93341f-731d-4fb1-ab3f-2a3396c8f2ad service nova] Releasing lock "refresh_cache-b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Jul 27 09:30:58 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:30:59 user nova-compute[70374]: DEBUG nova.virt.driver [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Emitting event Resumed> {{(pid=70374) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Jul 27 09:30:59 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] VM Resumed (Lifecycle Event) Jul 27 09:30:59 user nova-compute[70374]: DEBUG nova.compute.manager [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Instance event wait completed in 0 seconds for {{(pid=70374) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Jul 27 09:30:59 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Guest created on hypervisor {{(pid=70374) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4411}} Jul 27 09:30:59 user nova-compute[70374]: INFO nova.virt.libvirt.driver [-] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Instance spawned successfully. 
Jul 27 09:30:59 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Attempting to register defaults for the following image properties: ['hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:894}} Jul 27 09:30:59 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:30:59 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Found default for hw_input_bus of None {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:59 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Found default for hw_pointer_model of None {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:59 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Found default for hw_video_model of virtio {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:59 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Found default for hw_vif_model of virtio {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:59 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70374) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Jul 27 09:30:59 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] During sync_power_state the instance has a pending task (spawning). Skip. 
Jul 27 09:30:59 user nova-compute[70374]: DEBUG nova.virt.driver [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Emitting event Started> {{(pid=70374) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Jul 27 09:30:59 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] VM Started (Lifecycle Event) Jul 27 09:30:59 user nova-compute[70374]: DEBUG nova.compute.manager [req-876278b4-c511-47b5-9f6e-7897f7b82dcd req-6232c384-7514-40df-88f9-bb1783354333 service nova] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Received event network-vif-plugged-c8fbf870-e2d7-4ec6-be17-98f3d9315f75 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:30:59 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-876278b4-c511-47b5-9f6e-7897f7b82dcd req-6232c384-7514-40df-88f9-bb1783354333 service nova] Acquiring lock "773917c6-56d7-4491-a760-05f51593b7f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:30:59 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-876278b4-c511-47b5-9f6e-7897f7b82dcd req-6232c384-7514-40df-88f9-bb1783354333 service nova] Lock "773917c6-56d7-4491-a760-05f51593b7f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.003s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:30:59 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-876278b4-c511-47b5-9f6e-7897f7b82dcd req-6232c384-7514-40df-88f9-bb1783354333 service nova] Lock "773917c6-56d7-4491-a760-05f51593b7f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:59 user nova-compute[70374]: DEBUG nova.compute.manager [req-876278b4-c511-47b5-9f6e-7897f7b82dcd req-6232c384-7514-40df-88f9-bb1783354333 service nova] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] No waiting events found dispatching network-vif-plugged-c8fbf870-e2d7-4ec6-be17-98f3d9315f75 {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:30:59 user nova-compute[70374]: WARNING nova.compute.manager [req-876278b4-c511-47b5-9f6e-7897f7b82dcd req-6232c384-7514-40df-88f9-bb1783354333 service nova] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Received unexpected event network-vif-plugged-c8fbf870-e2d7-4ec6-be17-98f3d9315f75 for instance with vm_state building and task_state spawning. Jul 27 09:30:59 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:30:59 user nova-compute[70374]: INFO nova.compute.manager [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Took 12.18 seconds to spawn the instance on the hypervisor. 
Jul 27 09:30:59 user nova-compute[70374]: DEBUG nova.compute.manager [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:30:59 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70374) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Jul 27 09:30:59 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] During sync_power_state the instance has a pending task (spawning). Skip. Jul 27 09:30:59 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:30:59 user nova-compute[70374]: DEBUG nova.compute.manager [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Cleaning up deleted instances {{(pid=70374) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11129}} Jul 27 09:30:59 user nova-compute[70374]: DEBUG nova.compute.manager [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] There are 0 instances to clean {{(pid=70374) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11138}} Jul 27 09:30:59 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:30:59 user nova-compute[70374]: DEBUG nova.compute.manager [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Cleaning up deleted instances with incomplete migration {{(pid=70374) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11167}} Jul 27 09:30:59 user nova-compute[70374]: INFO nova.compute.manager [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Took 14.20 seconds to build instance. 
Jul 27 09:30:59 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:30:59 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-19982597-4a94-4f10-8524-9853250946dc tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Lock "309c9c26-4a0f-45db-bb3a-595b19f3f627" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 14.511s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:30:59 user nova-compute[70374]: DEBUG nova.compute.manager [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Instance event wait completed in 0 seconds for {{(pid=70374) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Jul 27 09:30:59 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Guest created on hypervisor {{(pid=70374) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4411}} Jul 27 09:30:59 user nova-compute[70374]: DEBUG nova.virt.driver [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Emitting event Resumed> {{(pid=70374) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Jul 27 09:30:59 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] VM Resumed (Lifecycle Event) Jul 27 09:30:59 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:30:59 user nova-compute[70374]: INFO nova.virt.libvirt.driver [-] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Instance spawned successfully. 
Jul 27 09:30:59 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:894}} Jul 27 09:30:59 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70374) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Jul 27 09:30:59 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] During sync_power_state the instance has a pending task (spawning). Skip. Jul 27 09:30:59 user nova-compute[70374]: DEBUG nova.virt.driver [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Emitting event Started> {{(pid=70374) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Jul 27 09:30:59 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] VM Started (Lifecycle Event) Jul 27 09:30:59 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Found default for hw_cdrom_bus of ide {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:59 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Found default for hw_disk_bus of virtio {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:59 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Found default for hw_input_bus of None {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:59 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Found default for hw_pointer_model of None {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:59 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Found default for hw_video_model of virtio 
{{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:59 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Found default for hw_vif_model of virtio {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:30:59 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:30:59 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70374) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Jul 27 09:30:59 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] During sync_power_state the instance has a pending task (spawning). Skip. Jul 27 09:31:00 user nova-compute[70374]: INFO nova.compute.manager [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Took 10.30 seconds to spawn the instance on the hypervisor. Jul 27 09:31:00 user nova-compute[70374]: DEBUG nova.compute.manager [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:31:00 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:00 user nova-compute[70374]: INFO nova.compute.manager [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Took 13.61 seconds to build instance. 
Jul 27 09:31:00 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:00 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:00 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:00 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-393b9621-444f-44f7-9d2d-5e5143fb26b6 tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Lock "773917c6-56d7-4491-a760-05f51593b7f0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 14.121s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:31:01 user nova-compute[70374]: DEBUG nova.compute.manager [req-ad96762b-e06e-43e9-9c8d-8e768e7eea97 req-656a178a-6b21-4df4-bc9d-13fded490b29 service nova] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Received event network-vif-plugged-6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:31:01 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-ad96762b-e06e-43e9-9c8d-8e768e7eea97 req-656a178a-6b21-4df4-bc9d-13fded490b29 service nova] Acquiring lock "b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:31:01 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-ad96762b-e06e-43e9-9c8d-8e768e7eea97 req-656a178a-6b21-4df4-bc9d-13fded490b29 service nova] Lock "b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:31:01 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-ad96762b-e06e-43e9-9c8d-8e768e7eea97 req-656a178a-6b21-4df4-bc9d-13fded490b29 service nova] Lock "b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:31:01 user nova-compute[70374]: DEBUG nova.compute.manager [req-ad96762b-e06e-43e9-9c8d-8e768e7eea97 req-656a178a-6b21-4df4-bc9d-13fded490b29 service nova] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] No waiting events found dispatching network-vif-plugged-6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:31:01 user nova-compute[70374]: WARNING nova.compute.manager [req-ad96762b-e06e-43e9-9c8d-8e768e7eea97 req-656a178a-6b21-4df4-bc9d-13fded490b29 service nova] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Received unexpected event network-vif-plugged-6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f for instance with vm_state building and task_state spawning. 
Jul 27 09:31:01 user nova-compute[70374]: DEBUG nova.compute.manager [req-ad96762b-e06e-43e9-9c8d-8e768e7eea97 req-656a178a-6b21-4df4-bc9d-13fded490b29 service nova] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Received event network-vif-plugged-6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:31:01 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-ad96762b-e06e-43e9-9c8d-8e768e7eea97 req-656a178a-6b21-4df4-bc9d-13fded490b29 service nova] Acquiring lock "b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:31:01 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-ad96762b-e06e-43e9-9c8d-8e768e7eea97 req-656a178a-6b21-4df4-bc9d-13fded490b29 service nova] Lock "b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:31:01 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-ad96762b-e06e-43e9-9c8d-8e768e7eea97 req-656a178a-6b21-4df4-bc9d-13fded490b29 service nova] Lock "b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:31:01 user nova-compute[70374]: DEBUG nova.compute.manager [req-ad96762b-e06e-43e9-9c8d-8e768e7eea97 req-656a178a-6b21-4df4-bc9d-13fded490b29 service nova] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] No waiting events found dispatching network-vif-plugged-6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:31:01 user nova-compute[70374]: WARNING nova.compute.manager [req-ad96762b-e06e-43e9-9c8d-8e768e7eea97 req-656a178a-6b21-4df4-bc9d-13fded490b29 service nova] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Received unexpected event network-vif-plugged-6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f for instance with vm_state building and task_state spawning. 
Jul 27 09:31:01 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:01 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:31:01 user nova-compute[70374]: DEBUG nova.compute.manager [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Starting heal instance info cache {{(pid=70374) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9842}} Jul 27 09:31:01 user nova-compute[70374]: DEBUG nova.compute.manager [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Rebuilding the list of instances to heal {{(pid=70374) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9846}} Jul 27 09:31:01 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:01 user nova-compute[70374]: DEBUG nova.compute.manager [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Skipping network cache update for instance because it is Building. {{(pid=70374) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9855}} Jul 27 09:31:01 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Acquiring lock "refresh_cache-04a990b9-ed32-41e9-b384-f0886e8d1b49" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Jul 27 09:31:01 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Acquired lock "refresh_cache-04a990b9-ed32-41e9-b384-f0886e8d1b49" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Jul 27 09:31:01 user nova-compute[70374]: DEBUG nova.network.neutron [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Forcefully refreshing network info cache for instance {{(pid=70374) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1996}} Jul 27 09:31:01 user nova-compute[70374]: DEBUG nova.objects.instance [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Lazy-loading 'info_cache' on Instance uuid 04a990b9-ed32-41e9-b384-f0886e8d1b49 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:31:02 user nova-compute[70374]: DEBUG nova.network.neutron [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Updating instance_info_cache with network_info: [{"id": "6858719a-7fa4-4a64-ad8a-02c9274adb55", "address": "fa:16:3e:e6:e0:42", "network": {"id": "e1a0003b-0f63-4efa-9d24-933f88eb0373", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-651468270-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "65e935bb36794aa98eb3e94c30e647d9", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6858719a-7f", "ovs_interfaceid": "6858719a-7fa4-4a64-ad8a-02c9274adb55", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:31:02 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Releasing lock "refresh_cache-04a990b9-ed32-41e9-b384-f0886e8d1b49" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Jul 27 09:31:02 user nova-compute[70374]: DEBUG nova.compute.manager [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] [instance: 04a990b9-ed32-41e9-b384-f0886e8d1b49] Updated the network info_cache for instance {{(pid=70374) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} Jul 27 09:31:02 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:31:02 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:31:02 user nova-compute[70374]: DEBUG nova.compute.manager [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=70374) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10461}} Jul 27 09:31:02 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager.update_available_resource {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:31:02 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:31:02 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:31:02 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:31:02 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Auditing locally available compute resources for user (node: user) {{(pid=70374) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} Jul 27 09:31:02 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:02 user nova-compute[70374]: DEBUG nova.virt.driver [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Emitting event Resumed> {{(pid=70374) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Jul 27 09:31:02 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] VM Resumed (Lifecycle Event) Jul 27 09:31:02 user nova-compute[70374]: DEBUG nova.compute.manager [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Instance event wait completed in 0 seconds for {{(pid=70374) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Jul 27 09:31:02 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Guest created on hypervisor {{(pid=70374) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4411}} Jul 27 09:31:03 user nova-compute[70374]: INFO nova.virt.libvirt.driver [-] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Instance spawned successfully. 
Jul 27 09:31:03 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:894}} Jul 27 09:31:03 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:31:03 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70374) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Jul 27 09:31:03 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Found default for hw_cdrom_bus of ide {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:31:03 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Found default for hw_disk_bus of virtio {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:31:03 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Found default for hw_input_bus of None {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:31:03 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Found default for hw_pointer_model of None {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:31:03 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Found default for hw_video_model of virtio {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:31:03 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Found default for 
hw_vif_model of virtio {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:31:03 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] During sync_power_state the instance has a pending task (spawning). Skip. Jul 27 09:31:03 user nova-compute[70374]: DEBUG nova.virt.driver [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Emitting event Started> {{(pid=70374) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Jul 27 09:31:03 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] VM Started (Lifecycle Event) Jul 27 09:31:03 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:31:03 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70374) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Jul 27 09:31:03 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] During sync_power_state the instance has a pending task (spawning). Skip. Jul 27 09:31:03 user nova-compute[70374]: INFO nova.compute.manager [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Took 9.20 seconds to spawn the instance on the hypervisor. Jul 27 09:31:03 user nova-compute[70374]: DEBUG nova.compute.manager [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:31:03 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/773917c6-56d7-4491-a760-05f51593b7f0/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:31:03 user nova-compute[70374]: INFO nova.compute.manager [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Took 10.62 seconds to build instance. 
Jul 27 09:31:03 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-22a0c6f6-9a97-473e-b18a-9708c1273e24 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Lock "b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 10.778s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:31:03 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/773917c6-56d7-4491-a760-05f51593b7f0/disk --force-share --output=json" returned: 0 in 0.180s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:31:03 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/773917c6-56d7-4491-a760-05f51593b7f0/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:31:03 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/773917c6-56d7-4491-a760-05f51593b7f0/disk --force-share --output=json" returned: 0 in 0.161s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:31:03 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6ae93f34-ce7e-4ae4-a5ba-36508361bd54/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:31:03 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6ae93f34-ce7e-4ae4-a5ba-36508361bd54/disk --force-share --output=json" returned: 0 in 0.162s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:31:03 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6ae93f34-ce7e-4ae4-a5ba-36508361bd54/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:31:03 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:04 user 
nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6ae93f34-ce7e-4ae4-a5ba-36508361bd54/disk --force-share --output=json" returned: 0 in 0.180s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:31:04 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d35fe056-8279-479a-a673-6c61e5ec6933/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:31:04 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d35fe056-8279-479a-a673-6c61e5ec6933/disk --force-share --output=json" returned: 0 in 0.214s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:31:04 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d35fe056-8279-479a-a673-6c61e5ec6933/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:31:04 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d35fe056-8279-479a-a673-6c61e5ec6933/disk --force-share --output=json" returned: 0 in 0.220s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:31:04 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/04a990b9-ed32-41e9-b384-f0886e8d1b49/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:31:04 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/04a990b9-ed32-41e9-b384-f0886e8d1b49/disk --force-share --output=json" returned: 0 in 0.255s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:31:04 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env 
LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/04a990b9-ed32-41e9-b384-f0886e8d1b49/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:31:04 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/04a990b9-ed32-41e9-b384-f0886e8d1b49/disk --force-share --output=json" returned: 0 in 0.206s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:31:04 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fb5ccac9-1e45-4726-b681-cf34cf3fa521/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:31:05 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fb5ccac9-1e45-4726-b681-cf34cf3fa521/disk --force-share --output=json" returned: 0 in 0.193s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:31:05 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fb5ccac9-1e45-4726-b681-cf34cf3fa521/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:31:05 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fb5ccac9-1e45-4726-b681-cf34cf3fa521/disk --force-share --output=json" returned: 0 in 0.231s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:31:05 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a5593cd-3ab6-4859-b0fb-33a0ed702dc8/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:31:05 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a5593cd-3ab6-4859-b0fb-33a0ed702dc8/disk --force-share --output=json" returned: 0 in 0.178s {{(pid=70374) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:31:05 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a5593cd-3ab6-4859-b0fb-33a0ed702dc8/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:31:05 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a5593cd-3ab6-4859-b0fb-33a0ed702dc8/disk --force-share --output=json" returned: 0 in 0.184s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:31:05 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/42f4c546-47e4-485b-be29-4081c7557bad/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:31:06 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/42f4c546-47e4-485b-be29-4081c7557bad/disk --force-share --output=json" returned: 0 in 0.181s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:31:06 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/42f4c546-47e4-485b-be29-4081c7557bad/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:31:06 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/42f4c546-47e4-485b-be29-4081c7557bad/disk --force-share --output=json" returned: 0 in 0.160s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:31:06 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/25214e8a-c626-46a7-b273-eb491c2fc91b/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:31:06 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD 
"/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/25214e8a-c626-46a7-b273-eb491c2fc91b/disk --force-share --output=json" returned: 0 in 0.182s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:31:06 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/25214e8a-c626-46a7-b273-eb491c2fc91b/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:31:06 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/25214e8a-c626-46a7-b273-eb491c2fc91b/disk --force-share --output=json" returned: 0 in 0.203s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:31:06 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8640c525-e6ba-4bf8-9fe0-2c08155dd1cb/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:31:06 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8640c525-e6ba-4bf8-9fe0-2c08155dd1cb/disk --force-share --output=json" returned: 0 in 0.180s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:31:06 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8640c525-e6ba-4bf8-9fe0-2c08155dd1cb/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:31:07 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8640c525-e6ba-4bf8-9fe0-2c08155dd1cb/disk --force-share --output=json" returned: 0 in 0.216s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:31:07 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/309c9c26-4a0f-45db-bb3a-595b19f3f627/disk --force-share 
--output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:31:07 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/309c9c26-4a0f-45db-bb3a-595b19f3f627/disk --force-share --output=json" returned: 0 in 0.202s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:31:07 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/309c9c26-4a0f-45db-bb3a-595b19f3f627/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:31:07 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/309c9c26-4a0f-45db-bb3a-595b19f3f627/disk --force-share --output=json" returned: 0 in 0.162s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:31:07 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:31:07 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06/disk --force-share --output=json" returned: 0 in 0.171s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:31:07 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:31:07 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06/disk --force-share --output=json" returned: 0 in 0.167s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:31:08 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) 
__log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:08 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Jul 27 09:31:08 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Jul 27 09:31:08 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Hypervisor/Node resource view: name=user free_ram=7304MB free_disk=25.77625274658203GB free_vcpus=1 pci_devices=[{"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", 
"dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", 
"product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70374) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1080}} Jul 27 09:31:08 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:31:08 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.002s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:31:08 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Instance 04a990b9-ed32-41e9-b384-f0886e8d1b49 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 512, 'VCPU': 1}}. {{(pid=70374) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1678}} Jul 27 09:31:08 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Instance fb5ccac9-1e45-4726-b681-cf34cf3fa521 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 512, 'VCPU': 1}}. {{(pid=70374) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1678}} Jul 27 09:31:08 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Instance 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 512, 'VCPU': 1}}. 
{{(pid=70374) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1678}} Jul 27 09:31:08 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Instance 6ae93f34-ce7e-4ae4-a5ba-36508361bd54 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 512, 'VCPU': 1}}. {{(pid=70374) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1678}} Jul 27 09:31:08 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Instance 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 512, 'VCPU': 1}}. {{(pid=70374) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1678}} Jul 27 09:31:08 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Instance 25214e8a-c626-46a7-b273-eb491c2fc91b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 512, 'VCPU': 1}}. {{(pid=70374) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1678}} Jul 27 09:31:08 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Instance 42f4c546-47e4-485b-be29-4081c7557bad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 512, 'VCPU': 1}}. {{(pid=70374) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1678}} Jul 27 09:31:08 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Instance 309c9c26-4a0f-45db-bb3a-595b19f3f627 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 512, 'VCPU': 1}}. {{(pid=70374) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1678}} Jul 27 09:31:08 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Instance d35fe056-8279-479a-a673-6c61e5ec6933 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 512, 'VCPU': 1}}. {{(pid=70374) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1678}} Jul 27 09:31:08 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Instance 773917c6-56d7-4491-a760-05f51593b7f0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 512, 'VCPU': 1}}. {{(pid=70374) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1678}} Jul 27 09:31:08 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Instance b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 512, 'VCPU': 1}}. 
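The pci_devices list in the hypervisor resource view above is long but uniform: 44 type-PCI entries by a quick count, every one with numa_node null, almost all vendor 15ad (VMware's PCI vendor ID, which points to this compute host itself being a VMware guest) plus a handful of 8086 (Intel) chipset functions and a single 1000 (LSI) storage controller. A short condensation sketch, with three entries copied from the dump as sample data:

from collections import Counter

# Three entries copied verbatim from the pci_devices dump above; the full list
# continues in exactly the same shape.
pci_devices = [
    {"address": "0000:00:15.7", "vendor_id": "15ad", "product_id": "07a0",
     "numa_node": None, "dev_type": "type-PCI"},
    {"address": "0000:00:10.0", "vendor_id": "1000", "product_id": "0030",
     "numa_node": None, "dev_type": "type-PCI"},
    {"address": "0000:00:07.0", "vendor_id": "8086", "product_id": "7110",
     "numa_node": None, "dev_type": "type-PCI"},
]

counts = Counter((d["vendor_id"], d["product_id"]) for d in pci_devices)
for (vendor, product), n in sorted(counts.items()):
    print(f"{vendor}:{product}  x{n}")

Since the final resource view below reports pci_stats=[], none of these devices appear to be in a passthrough whitelist; the dump is informational rather than schedulable inventory.
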
{{(pid=70374) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1678}} Jul 27 09:31:08 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Total usable vcpus: 12, total allocated vcpus: 11 {{(pid=70374) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1103}} Jul 27 09:31:08 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Final resource view: name=user phys_ram=16011MB used_ram=6144MB phys_disk=40GB used_disk=11GB total_vcpus=12 used_vcpus=11 pci_stats=[] {{(pid=70374) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1112}} Jul 27 09:31:08 user nova-compute[70374]: DEBUG nova.scheduler.client.report [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Refreshing inventories for resource provider f7548644-4a09-4ad8-9aa6-6e05d85a9f5b {{(pid=70374) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Jul 27 09:31:08 user nova-compute[70374]: DEBUG nova.scheduler.client.report [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Updating ProviderTree inventory for provider f7548644-4a09-4ad8-9aa6-6e05d85a9f5b from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16011, 'reserved': 512, 'min_unit': 1, 'max_unit': 16011, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70374) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Jul 27 09:31:08 user nova-compute[70374]: DEBUG nova.compute.provider_tree [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Updating inventory in ProviderTree for provider f7548644-4a09-4ad8-9aa6-6e05d85a9f5b with inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16011, 'reserved': 512, 'min_unit': 1, 'max_unit': 16011, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70374) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Jul 27 09:31:08 user nova-compute[70374]: DEBUG nova.scheduler.client.report [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Refreshing aggregate associations for resource provider f7548644-4a09-4ad8-9aa6-6e05d85a9f5b, aggregates: None {{(pid=70374) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Jul 27 09:31:08 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:08 user nova-compute[70374]: DEBUG nova.scheduler.client.report [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Refreshing trait associations for resource provider f7548644-4a09-4ad8-9aa6-6e05d85a9f5b, traits: 
COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_TRUSTED_CERTS,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE2 {{(pid=70374) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Jul 27 09:31:09 user nova-compute[70374]: DEBUG nova.compute.provider_tree [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Inventory has not changed in ProviderTree for provider: f7548644-4a09-4ad8-9aa6-6e05d85a9f5b {{(pid=70374) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Jul 27 09:31:09 user nova-compute[70374]: DEBUG nova.scheduler.client.report [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Inventory has not changed for provider f7548644-4a09-4ad8-9aa6-6e05d85a9f5b based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16011, 'reserved': 512, 'min_unit': 1, 'max_unit': 16011, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70374) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Jul 27 09:31:09 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Compute_service record updated for user:user {{(pid=70374) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1041}} Jul 27 09:31:09 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.012s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:31:10 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:31:10 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:31:10 user 
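The inventory payload that placement keeps reporting as unchanged is what the scheduler sizes this node by. Working the logged numbers through the usual capacity rule, capacity = (total - reserved) * allocation_ratio (stated here as this note's gloss on the figures, not a quote from the code):

# Figures copied from the inventory payload logged above.
inventory = {
    "VCPU":      {"total": 12,    "reserved": 0,   "allocation_ratio": 4.0},
    "MEMORY_MB": {"total": 16011, "reserved": 512, "allocation_ratio": 1.0},
    "DISK_GB":   {"total": 40,    "reserved": 0,   "allocation_ratio": 1.0},
}

for rc, inv in inventory.items():
    capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
    print(f"{rc}: schedulable capacity {capacity:g}")

# VCPU 48, MEMORY_MB 15499, DISK_GB 40. With "total allocated vcpus: 11" out
# of 12 physical cores, the 4.0 overcommit ratio still leaves ample VCPU room;
# the tighter limits on this node are memory and the 40 GB disk (used_disk=11GB
# in the final resource view).
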
nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:31:10 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:31:10 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:31:13 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:13 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:18 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:18 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:23 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:23 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:28 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:29 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:33 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:34 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:38 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:40 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:43 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:43 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:44 user nova-compute[70374]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:47 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:48 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:49 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:49 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:49 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:52 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Acquiring lock "ea21f129-1e0a-4ad1-b516-c2a8f92835b7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:31:52 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Lock "ea21f129-1e0a-4ad1-b516-c2a8f92835b7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:31:52 user nova-compute[70374]: DEBUG nova.compute.manager [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Starting instance... {{(pid=70374) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} Jul 27 09:31:52 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:31:52 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:31:52 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Require both a host and instance NUMA topology to fit instance on host. 
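The "Require both a host and instance NUMA topology to fit instance on host" debug line during the claim means NUMA fitting is skipped outright here: the m1.tiny flavor and cirros image request no guest NUMA topology (the instance record later shows numa_topology=None), so there is nothing to place. Paraphrased as the guard that message describes (an illustration only, not the real nova.virt.hardware body):

def numa_fit_instance_to_host(host_topology, instance_topology):
    # Either side missing means no NUMA constraints apply to this boot, which
    # is the case logged above: the instance asked for no NUMA topology.
    if not (host_topology and instance_topology):
        return None
    # A real implementation would try to map each instance NUMA cell onto a
    # host cell here.
    raise NotImplementedError

print(numa_fit_instance_to_host(host_topology=object(), instance_topology=None))  # -> None
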
{{(pid=70374) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Jul 27 09:31:52 user nova-compute[70374]: INFO nova.compute.claims [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Claim successful on node user Jul 27 09:31:52 user nova-compute[70374]: DEBUG nova.compute.provider_tree [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Inventory has not changed in ProviderTree for provider: f7548644-4a09-4ad8-9aa6-6e05d85a9f5b {{(pid=70374) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Jul 27 09:31:52 user nova-compute[70374]: DEBUG nova.scheduler.client.report [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Inventory has not changed for provider f7548644-4a09-4ad8-9aa6-6e05d85a9f5b based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16011, 'reserved': 512, 'min_unit': 1, 'max_unit': 16011, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70374) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Jul 27 09:31:52 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.466s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:31:52 user nova-compute[70374]: DEBUG nova.compute.manager [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Start building networks asynchronously for instance. {{(pid=70374) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} Jul 27 09:31:52 user nova-compute[70374]: DEBUG nova.compute.manager [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Allocating IP information in the background. {{(pid=70374) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} Jul 27 09:31:52 user nova-compute[70374]: DEBUG nova.network.neutron [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] allocate_for_instance() {{(pid=70374) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} Jul 27 09:31:52 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Ignoring supplied device name: /dev/vda. 
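The "Acquiring lock ... / Lock ... acquired ... waited Ns / ... released ... held Ns" triplets around both the periodic audit (held 1.012s at 09:31:09) and this instance claim (held 0.466s) come from oslo.concurrency's lockutils, whose inner wrapper is the source cited in those entries; the shared "compute_resources" semaphore is what keeps the audit and concurrent claims from interleaving. A minimal use of the same primitive in isolation (assumes only that oslo.concurrency is installed; Nova's own decorator stack is not reproduced):

from oslo_concurrency import lockutils

@lockutils.synchronized("compute_resources")
def update_available_resource():
    # Runs while holding the in-process "compute_resources" semaphore, so the
    # periodic resource audit and instance claims cannot interleave.
    pass

def instance_claim():
    # The same named lock, taken explicitly as a context manager.
    with lockutils.lock("compute_resources"):
        pass

update_available_resource()
instance_claim()
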
Libvirt can't honour user-supplied dev names Jul 27 09:31:52 user nova-compute[70374]: DEBUG nova.compute.manager [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Start building block device mappings for instance. {{(pid=70374) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} Jul 27 09:31:52 user nova-compute[70374]: DEBUG nova.policy [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fa672f231aae45d6a68bddddf583dcde', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3b6c645430b147a0983e499772a43790', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70374) authorize /opt/stack/nova/nova/policy.py:203}} Jul 27 09:31:52 user nova-compute[70374]: DEBUG nova.compute.manager [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Start spawning the instance on the hypervisor. {{(pid=70374) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} Jul 27 09:31:52 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Creating instance directory {{(pid=70374) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Jul 27 09:31:52 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Creating image(s) Jul 27 09:31:52 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Acquiring lock "/opt/stack/data/nova/instances/ea21f129-1e0a-4ad1-b516-c2a8f92835b7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:31:52 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Lock "/opt/stack/data/nova/instances/ea21f129-1e0a-4ad1-b516-c2a8f92835b7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:31:52 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Lock "/opt/stack/data/nova/instances/ea21f129-1e0a-4ad1-b516-c2a8f92835b7/disk.info" "released" by 
"nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:31:52 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:31:53 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json" returned: 0 in 0.129s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:31:53 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Acquiring lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:31:53 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:31:53 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:31:53 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json" returned: 0 in 0.139s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:31:53 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Running cmd (subprocess): env 
LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1,backing_fmt=raw /opt/stack/data/nova/instances/ea21f129-1e0a-4ad1-b516-c2a8f92835b7/disk 1073741824 {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:31:53 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1,backing_fmt=raw /opt/stack/data/nova/instances/ea21f129-1e0a-4ad1-b516-c2a8f92835b7/disk 1073741824" returned: 0 in 0.046s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:31:53 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.192s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:31:53 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:31:53 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:53 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json" returned: 0 in 0.137s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:31:53 user nova-compute[70374]: DEBUG nova.virt.disk.api [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Checking if we can resize image /opt/stack/data/nova/instances/ea21f129-1e0a-4ad1-b516-c2a8f92835b7/disk. 
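The qemu-img create call above is the qcow2 image backend in action: the Glance image was already cached as a flat base file under _base/, and the new instance gets only a copy-on-write overlay whose backing file points at that base. Reproduced stand-alone, with the paths and the 1 GiB size taken from the log and qemu-img assumed to be on PATH:

import os
import subprocess

BASE = ("/opt/stack/data/nova/instances/_base/"
        "eecbf4ea539212b0e09996b77db3066885fb0ed1")
OVERLAY = ("/opt/stack/data/nova/instances/"
           "ea21f129-1e0a-4ad1-b516-c2a8f92835b7/disk")
SIZE = 1 * 1024 ** 3  # 1073741824 bytes: the flavor's 1 GB root disk

subprocess.run(
    ["qemu-img", "create", "-f", "qcow2",
     "-o", f"backing_file={BASE},backing_fmt=raw", OVERLAY, str(SIZE)],
    check=True,
    env={**os.environ, "LC_ALL": "C", "LANG": "C"},
)

The later "Cannot resize image ... to a smaller size" debug line is expected: the overlay was just created at exactly the requested 1073741824 bytes, so there is nothing to grow and the resize helper declines.
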
size=1073741824 {{(pid=70374) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Jul 27 09:31:53 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ea21f129-1e0a-4ad1-b516-c2a8f92835b7/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:31:53 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ea21f129-1e0a-4ad1-b516-c2a8f92835b7/disk --force-share --output=json" returned: 0 in 0.153s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:31:53 user nova-compute[70374]: DEBUG nova.virt.disk.api [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Cannot resize image /opt/stack/data/nova/instances/ea21f129-1e0a-4ad1-b516-c2a8f92835b7/disk to a smaller size. {{(pid=70374) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Jul 27 09:31:53 user nova-compute[70374]: DEBUG nova.objects.instance [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Lazy-loading 'migration_context' on Instance uuid ea21f129-1e0a-4ad1-b516-c2a8f92835b7 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:31:53 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Created local disks {{(pid=70374) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4851}} Jul 27 09:31:53 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Ensure instance console log exists: /opt/stack/data/nova/instances/ea21f129-1e0a-4ad1-b516-c2a8f92835b7/console.log {{(pid=70374) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4603}} Jul 27 09:31:53 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:31:53 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70374) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:31:53 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:31:53 user nova-compute[70374]: DEBUG nova.network.neutron [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Successfully created port: faecacb1-2e14-4d38-8c90-bfb3ff5e23a7 {{(pid=70374) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG nova.network.neutron [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Successfully updated port: faecacb1-2e14-4d38-8c90-bfb3ff5e23a7 {{(pid=70374) _update_port /opt/stack/nova/nova/network/neutron.py:586}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Acquiring lock "refresh_cache-ea21f129-1e0a-4ad1-b516-c2a8f92835b7" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Acquired lock "refresh_cache-ea21f129-1e0a-4ad1-b516-c2a8f92835b7" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG nova.network.neutron [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Building network info cache for instance {{(pid=70374) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG nova.compute.manager [req-400c5804-cd73-4545-85c5-94fc45c9bea3 req-90afc4ed-8b83-469b-b355-43b8f07de3dd service nova] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Received event network-changed-faecacb1-2e14-4d38-8c90-bfb3ff5e23a7 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG nova.compute.manager [req-400c5804-cd73-4545-85c5-94fc45c9bea3 req-90afc4ed-8b83-469b-b355-43b8f07de3dd service nova] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Refreshing instance network info cache due to event network-changed-faecacb1-2e14-4d38-8c90-bfb3ff5e23a7. 
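The "Received event network-changed-faecacb1-..." entries arrive through Nova's external-events API: once Neutron updates the port it notifies Nova, and the compute manager reacts by refreshing the instance's network info cache, which is the refresh_cache lock traffic seen here. The notification body is roughly the following shape (a sketch of the public os-server-external-events payload as this note understands it; field names should be checked against the API reference rather than inferred from this log):

# Approximate payload Neutron's Nova notifier POSTs to
# /v2.1/os-server-external-events when a port changes.
event = {
    "events": [
        {
            "name": "network-changed",
            "server_uuid": "ea21f129-1e0a-4ad1-b516-c2a8f92835b7",
            "tag": "faecacb1-2e14-4d38-8c90-bfb3ff5e23a7",  # the port ID
        }
    ]
}

# The compute log above renders it as the single token
# "network-changed-<port id>" once it reaches external_instance_event.
print(event["events"][0]["name"], event["events"][0]["tag"])
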
{{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-400c5804-cd73-4545-85c5-94fc45c9bea3 req-90afc4ed-8b83-469b-b355-43b8f07de3dd service nova] Acquiring lock "refresh_cache-ea21f129-1e0a-4ad1-b516-c2a8f92835b7" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG nova.network.neutron [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Instance cache missing network info. {{(pid=70374) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG nova.network.neutron [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Updating instance_info_cache with network_info: [{"id": "faecacb1-2e14-4d38-8c90-bfb3ff5e23a7", "address": "fa:16:3e:37:d2:3d", "network": {"id": "cb56461a-b7fd-4c8f-94b2-20029820dfd1", "bridge": "br-int", "label": "tempest-TestStampPattern-1598636493-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3b6c645430b147a0983e499772a43790", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfaecacb1-2e", "ovs_interfaceid": "faecacb1-2e14-4d38-8c90-bfb3ff5e23a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Releasing lock "refresh_cache-ea21f129-1e0a-4ad1-b516-c2a8f92835b7" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG nova.compute.manager [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Instance network_info: |[{"id": "faecacb1-2e14-4d38-8c90-bfb3ff5e23a7", "address": "fa:16:3e:37:d2:3d", "network": {"id": "cb56461a-b7fd-4c8f-94b2-20029820dfd1", "bridge": "br-int", "label": "tempest-TestStampPattern-1598636493-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3b6c645430b147a0983e499772a43790", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", 
"bound_drivers": {"0": "ovn"}}, "devname": "tapfaecacb1-2e", "ovs_interfaceid": "faecacb1-2e14-4d38-8c90-bfb3ff5e23a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70374) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-400c5804-cd73-4545-85c5-94fc45c9bea3 req-90afc4ed-8b83-469b-b355-43b8f07de3dd service nova] Acquired lock "refresh_cache-ea21f129-1e0a-4ad1-b516-c2a8f92835b7" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG nova.network.neutron [req-400c5804-cd73-4545-85c5-94fc45c9bea3 req-90afc4ed-8b83-469b-b355-43b8f07de3dd service nova] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Refreshing network info cache for port faecacb1-2e14-4d38-8c90-bfb3ff5e23a7 {{(pid=70374) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Start _get_guest_xml network_info=[{"id": "faecacb1-2e14-4d38-8c90-bfb3ff5e23a7", "address": "fa:16:3e:37:d2:3d", "network": {"id": "cb56461a-b7fd-4c8f-94b2-20029820dfd1", "bridge": "br-int", "label": "tempest-TestStampPattern-1598636493-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3b6c645430b147a0983e499772a43790", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfaecacb1-2e", "ovs_interfaceid": "faecacb1-2e14-4d38-8c90-bfb3ff5e23a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-07-27T09:26:27Z,direct_url=,disk_format='qcow2',id=35458adf-261a-4e0b-a4db-b243619b2394,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='37aff788807d4b2aafc10d355583f7c7',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-07-27T09:26:29Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encrypted': False, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'size': 0, 'encryption_options': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'image_id': '35458adf-261a-4e0b-a4db-b243619b2394'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70374) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7553}} Jul 27 09:31:54 user nova-compute[70374]: WARNING 
nova.virt.libvirt.driver [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Jul 27 09:31:54 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Jul 27 09:31:54 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Searching host: 'user' for CPU controller through CGroups V1... {{(pid=70374) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1650}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] CPU controller missing on host. {{(pid=70374) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1660}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Searching host: 'user' for CPU controller through CGroups V2... {{(pid=70374) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1669}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] CPU controller found on host. 
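The pair of "Searching host ... for CPU controller through CGroups V1 ... / ... V2" probes, ending in "CPU controller missing on host" then "CPU controller found on host", show the driver checking that CPU limits can be applied before it builds the guest XML; on this host only the cgroup v2 unified hierarchy exposes the controller. One way to make the same determination from userspace (a generic sketch, not the host.py code path the entries cite):

import os

def cpu_controller_hierarchy():
    """Return 'v2', 'v1' or None depending on where a cpu controller is exposed."""
    # cgroup v2: the unified hierarchy lists its available controllers here.
    v2_file = "/sys/fs/cgroup/cgroup.controllers"
    if os.path.exists(v2_file):
        with open(v2_file) as f:
            if "cpu" in f.read().split():
                return "v2"
    # cgroup v1: the controller appears as its own mounted hierarchy.
    if os.path.isdir("/sys/fs/cgroup/cpu"):
        return "v1"
    return None

print(cpu_controller_hierarchy())
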
{{(pid=70374) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1676}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70374) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-07-27T09:26:16Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1',id=6,is_public=True,memory_mb=512,name='m1.tiny',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-07-27T09:26:27Z,direct_url=,disk_format='qcow2',id=35458adf-261a-4e0b-a4db-b243619b2394,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='37aff788807d4b2aafc10d355583f7c7',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-07-27T09:26:29Z,virtual_size=,visibility=), allow threads: True {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Flavor limits 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Image limits 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Flavor pref 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Image pref 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Jul 27 
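The CPU topology negotiation logged here starts from no preference at all (flavor and image limits and preferences of 0:0:0, with caps of sockets=65536, cores=65536, threads=65536 above) and, in the entries that follow, lands on the only layout possible for a single vCPU, VirtCPUTopology(cores=1,sockets=1,threads=1). The enumeration amounts to listing socket/core/thread factorizations of the vCPU count under those caps; a toy version of the idea (illustrative only, not the nova.virt.hardware implementation):

from itertools import product

def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
    """Yield (sockets, cores, threads) triples whose product equals vcpus."""
    for sockets, cores, threads in product(
            range(1, min(vcpus, max_sockets) + 1),
            range(1, min(vcpus, max_cores) + 1),
            range(1, min(vcpus, max_threads) + 1)):
        if sockets * cores * threads == vcpus:
            yield sockets, cores, threads

print(list(possible_topologies(1)))  # [(1, 1, 1)], matching the "Got 1 possible topologies" entry
print(list(possible_topologies(4)))  # every split of four vCPUs across sockets/cores/threads
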
09:31:54 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70374) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Got 1 possible topologies {{(pid=70374) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG nova.virt.libvirt.vif [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-07-27T09:31:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestStampPattern-server-161038782',display_name='tempest-TestStampPattern-server-161038782',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(6),hidden=False,host='user',hostname='tempest-teststamppattern-server-161038782',id=12,image_ref='35458adf-261a-4e0b-a4db-b243619b2394',info_cache=InstanceInfoCache,instance_type_id=6,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIj2SFnPDau6vQCtjDQG1ZxRhv34b+wb9sn5YbvH/SZ654hrAREIUSioV28ip+GznAtPzx4FbU600mwPtKiSdblyqS0Oc7MrVQ7evq+7Ird+40EGlZ1xk19BS7xqljznJA==',key_name='tempest-TestStampPattern-355396374',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3b6c645430b147a0983e499772a43790',ramdisk_id='',reservation_id='r-siobf74j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='35458adf-261a-4e0b-a4db-b243619b2394',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-TestStampPattern-795018650',owner_user_name='tempest-TestStampPattern-795018650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-07-27T09:31:53Z,user_data=None,user_id='fa672f231aae45d6a68bddddf583dcde',uuid=ea21f129-1e0a-4ad1-b516-c2a8f92835b7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "faecacb1-2e14-4d38-8c90-bfb3ff5e23a7", "address": "fa:16:3e:37:d2:3d", "network": {"id": "cb56461a-b7fd-4c8f-94b2-20029820dfd1", "bridge": "br-int", "label": "tempest-TestStampPattern-1598636493-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3b6c645430b147a0983e499772a43790", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfaecacb1-2e", "ovs_interfaceid": "faecacb1-2e14-4d38-8c90-bfb3ff5e23a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70374) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Converting VIF {"id": "faecacb1-2e14-4d38-8c90-bfb3ff5e23a7", "address": "fa:16:3e:37:d2:3d", "network": {"id": "cb56461a-b7fd-4c8f-94b2-20029820dfd1", "bridge": "br-int", "label": "tempest-TestStampPattern-1598636493-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3b6c645430b147a0983e499772a43790", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": 
true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfaecacb1-2e", "ovs_interfaceid": "faecacb1-2e14-4d38-8c90-bfb3ff5e23a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:37:d2:3d,bridge_name='br-int',has_traffic_filtering=True,id=faecacb1-2e14-4d38-8c90-bfb3ff5e23a7,network=Network(cb56461a-b7fd-4c8f-94b2-20029820dfd1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfaecacb1-2e') {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG nova.objects.instance [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Lazy-loading 'pci_devices' on Instance uuid ea21f129-1e0a-4ad1-b516-c2a8f92835b7 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] End _get_guest_xml xml= Jul 27 09:31:54 user nova-compute[70374]: ea21f129-1e0a-4ad1-b516-c2a8f92835b7 Jul 27 09:31:54 user nova-compute[70374]: instance-0000000c Jul 27 09:31:54 user nova-compute[70374]: 524288 Jul 27 09:31:54 user nova-compute[70374]: 1 Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: tempest-TestStampPattern-server-161038782 Jul 27 09:31:54 user nova-compute[70374]: 2023-07-27 09:31:54 Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: 512 Jul 27 09:31:54 user nova-compute[70374]: 1 Jul 27 09:31:54 user nova-compute[70374]: 0 Jul 27 09:31:54 user nova-compute[70374]: 0 Jul 27 09:31:54 user nova-compute[70374]: 1 Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: tempest-TestStampPattern-795018650-project-member Jul 27 09:31:54 user nova-compute[70374]: tempest-TestStampPattern-795018650 Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: OpenStack Foundation Jul 27 09:31:54 user nova-compute[70374]: OpenStack Nova Jul 27 09:31:54 user nova-compute[70374]: 0.0.0 Jul 27 09:31:54 user nova-compute[70374]: ea21f129-1e0a-4ad1-b516-c2a8f92835b7 Jul 27 09:31:54 user nova-compute[70374]: ea21f129-1e0a-4ad1-b516-c2a8f92835b7 Jul 27 09:31:54 user nova-compute[70374]: Virtual Machine Jul 27 09:31:54 user nova-compute[70374]: Jul 27 
09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: hvm Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Nehalem Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: /dev/urandom Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: Jul 27 09:31:54 user nova-compute[70374]: {{(pid=70374) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7559}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG nova.virt.libvirt.vif [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-07-27T09:31:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestStampPattern-server-161038782',display_name='tempest-TestStampPattern-server-161038782',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(6),hidden=False,host='user',hostname='tempest-teststamppattern-server-161038782',id=12,image_ref='35458adf-261a-4e0b-a4db-b243619b2394',info_cache=InstanceInfoCache,instance_type_id=6,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIj2SFnPDau6vQCtjDQG1ZxRhv34b+wb9sn5YbvH/SZ654hrAREIUSioV28ip+GznAtPzx4FbU600mwPtKiSdblyqS0Oc7MrVQ7evq+7Ird+40EGlZ1xk19BS7xqljznJA==',key_name='tempest-TestStampPattern-355396374',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3b6c645430b147a0983e499772a43790',ramdisk_id='',reservation_id='r-siobf74j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='35458adf-261a-4e0b-a4db-b243619b2394',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-TestStampPattern-795018650',owner_user_name='tempest-TestStampPattern-795018650-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-07-27T09:31:53Z,user_data=None,user_id='fa672f231aae45d6a68bddddf583dcde',uuid=ea21f129-1e0a-4ad1-b516-c2a8f92835b7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "faecacb1-2e14-4d38-8c90-bfb3ff5e23a7", "address": "fa:16:3e:37:d2:3d", "network": {"id": "cb56461a-b7fd-4c8f-94b2-20029820dfd1", "bridge": "br-int", "label": "tempest-TestStampPattern-1598636493-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3b6c645430b147a0983e499772a43790", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfaecacb1-2e", "ovs_interfaceid": "faecacb1-2e14-4d38-8c90-bfb3ff5e23a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Converting VIF {"id": "faecacb1-2e14-4d38-8c90-bfb3ff5e23a7", "address": "fa:16:3e:37:d2:3d", "network": {"id": "cb56461a-b7fd-4c8f-94b2-20029820dfd1", "bridge": "br-int", "label": "tempest-TestStampPattern-1598636493-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3b6c645430b147a0983e499772a43790", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, 
"connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfaecacb1-2e", "ovs_interfaceid": "faecacb1-2e14-4d38-8c90-bfb3ff5e23a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:37:d2:3d,bridge_name='br-int',has_traffic_filtering=True,id=faecacb1-2e14-4d38-8c90-bfb3ff5e23a7,network=Network(cb56461a-b7fd-4c8f-94b2-20029820dfd1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfaecacb1-2e') {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG os_vif [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:37:d2:3d,bridge_name='br-int',has_traffic_filtering=True,id=faecacb1-2e14-4d38-8c90-bfb3ff5e23a7,network=Network(cb56461a-b7fd-4c8f-94b2-20029820dfd1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfaecacb1-2e') {{(pid=70374) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfaecacb1-2e, may_exist=True, interface_attrs={}) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfaecacb1-2e, col_values=(('external_ids', {'iface-id': 'faecacb1-2e14-4d38-8c90-bfb3ff5e23a7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:37:d2:3d', 'vm-uuid': 'ea21f129-1e0a-4ad1-b516-c2a8f92835b7'}),), if_exists=True) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:54 user 
nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:54 user nova-compute[70374]: INFO os_vif [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:37:d2:3d,bridge_name='br-int',has_traffic_filtering=True,id=faecacb1-2e14-4d38-8c90-bfb3ff5e23a7,network=Network(cb56461a-b7fd-4c8f-94b2-20029820dfd1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfaecacb1-2e') Jul 27 09:31:54 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] No BDM found with device name vda, not building metadata. {{(pid=70374) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Jul 27 09:31:54 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] No VIF found with MAC fa:16:3e:37:d2:3d, not building metadata {{(pid=70374) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Jul 27 09:31:55 user nova-compute[70374]: DEBUG nova.network.neutron [req-400c5804-cd73-4545-85c5-94fc45c9bea3 req-90afc4ed-8b83-469b-b355-43b8f07de3dd service nova] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Updated VIF entry in instance network info cache for port faecacb1-2e14-4d38-8c90-bfb3ff5e23a7. 
{{(pid=70374) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Jul 27 09:31:55 user nova-compute[70374]: DEBUG nova.network.neutron [req-400c5804-cd73-4545-85c5-94fc45c9bea3 req-90afc4ed-8b83-469b-b355-43b8f07de3dd service nova] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Updating instance_info_cache with network_info: [{"id": "faecacb1-2e14-4d38-8c90-bfb3ff5e23a7", "address": "fa:16:3e:37:d2:3d", "network": {"id": "cb56461a-b7fd-4c8f-94b2-20029820dfd1", "bridge": "br-int", "label": "tempest-TestStampPattern-1598636493-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3b6c645430b147a0983e499772a43790", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfaecacb1-2e", "ovs_interfaceid": "faecacb1-2e14-4d38-8c90-bfb3ff5e23a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:31:55 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-400c5804-cd73-4545-85c5-94fc45c9bea3 req-90afc4ed-8b83-469b-b355-43b8f07de3dd service nova] Releasing lock "refresh_cache-ea21f129-1e0a-4ad1-b516-c2a8f92835b7" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Jul 27 09:31:56 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:56 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:56 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:56 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:56 user nova-compute[70374]: DEBUG nova.compute.manager [req-cb90a621-9927-4054-8a57-1d1df087a54e req-54b627f2-948e-4570-8870-91eaf468dd0f service nova] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Received event network-vif-plugged-faecacb1-2e14-4d38-8c90-bfb3ff5e23a7 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:31:56 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-cb90a621-9927-4054-8a57-1d1df087a54e req-54b627f2-948e-4570-8870-91eaf468dd0f service nova] Acquiring lock "ea21f129-1e0a-4ad1-b516-c2a8f92835b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:31:56 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-cb90a621-9927-4054-8a57-1d1df087a54e req-54b627f2-948e-4570-8870-91eaf468dd0f service nova] Lock "ea21f129-1e0a-4ad1-b516-c2a8f92835b7-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:31:56 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-cb90a621-9927-4054-8a57-1d1df087a54e req-54b627f2-948e-4570-8870-91eaf468dd0f service nova] Lock "ea21f129-1e0a-4ad1-b516-c2a8f92835b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:31:56 user nova-compute[70374]: DEBUG nova.compute.manager [req-cb90a621-9927-4054-8a57-1d1df087a54e req-54b627f2-948e-4570-8870-91eaf468dd0f service nova] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] No waiting events found dispatching network-vif-plugged-faecacb1-2e14-4d38-8c90-bfb3ff5e23a7 {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:31:56 user nova-compute[70374]: WARNING nova.compute.manager [req-cb90a621-9927-4054-8a57-1d1df087a54e req-54b627f2-948e-4570-8870-91eaf468dd0f service nova] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Received unexpected event network-vif-plugged-faecacb1-2e14-4d38-8c90-bfb3ff5e23a7 for instance with vm_state building and task_state spawning. Jul 27 09:31:56 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:56 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:56 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:56 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:58 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Acquiring lock "a72e9d21-b1f4-4e61-8875-c72d7ce88a32" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:31:58 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Lock "a72e9d21-b1f4-4e61-8875-c72d7ce88a32" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:31:58 user nova-compute[70374]: DEBUG nova.compute.manager [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Starting instance... 
{{(pid=70374) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} Jul 27 09:31:58 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:31:58 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:31:58 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=70374) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Jul 27 09:31:58 user nova-compute[70374]: INFO nova.compute.claims [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Claim successful on node user Jul 27 09:31:58 user nova-compute[70374]: DEBUG nova.virt.driver [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Emitting event Resumed> {{(pid=70374) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Jul 27 09:31:58 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] VM Resumed (Lifecycle Event) Jul 27 09:31:58 user nova-compute[70374]: DEBUG nova.compute.manager [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Instance event wait completed in 0 seconds for {{(pid=70374) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Jul 27 09:31:58 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Guest created on hypervisor {{(pid=70374) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4411}} Jul 27 09:31:58 user nova-compute[70374]: INFO nova.virt.libvirt.driver [-] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Instance spawned successfully. 
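The Acquiring/acquired/released lines around _locked_do_build_and_run_instance and the compute_resources claim above are emitted by oslo.concurrency's lockutils helpers. A minimal sketch of that pattern, assuming oslo.concurrency is installed and DEBUG logging is configured; the function names and bodies here are illustrative placeholders, not Nova's actual claim logic:

    from oslo_concurrency import lockutils

    # Decorator form: produces the "Lock '<name>' acquired by '<func>' :: waited Ns"
    # and "... 'released' ... :: held Ns" debug lines seen for "compute_resources".
    @lockutils.synchronized("compute_resources")
    def instance_claim():
        return "claimed"  # placeholder for work done while the lock is held

    # Context-manager form: produces the "Acquiring lock" / "Acquired lock" /
    # "Releasing lock" lines seen for the refresh_cache-<uuid> locks.
    def refresh_cache(instance_uuid):
        with lockutils.lock("refresh_cache-%s" % instance_uuid):
            return instance_uuid  # placeholder for rebuilding the network info cache

    if __name__ == "__main__":
        instance_claim()
        refresh_cache("ea21f129-1e0a-4ad1-b516-c2a8f92835b7")
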
Jul 27 09:31:58 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:894}} Jul 27 09:31:58 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:31:58 user nova-compute[70374]: DEBUG nova.compute.manager [req-81bcf4ae-20a2-4263-a39e-5a90cf3ea2f6 req-d012ed75-9386-4086-b150-64ba05ad2647 service nova] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Received event network-vif-plugged-faecacb1-2e14-4d38-8c90-bfb3ff5e23a7 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:31:58 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-81bcf4ae-20a2-4263-a39e-5a90cf3ea2f6 req-d012ed75-9386-4086-b150-64ba05ad2647 service nova] Acquiring lock "ea21f129-1e0a-4ad1-b516-c2a8f92835b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:31:58 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-81bcf4ae-20a2-4263-a39e-5a90cf3ea2f6 req-d012ed75-9386-4086-b150-64ba05ad2647 service nova] Lock "ea21f129-1e0a-4ad1-b516-c2a8f92835b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:31:58 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-81bcf4ae-20a2-4263-a39e-5a90cf3ea2f6 req-d012ed75-9386-4086-b150-64ba05ad2647 service nova] Lock "ea21f129-1e0a-4ad1-b516-c2a8f92835b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:31:58 user nova-compute[70374]: DEBUG nova.compute.manager [req-81bcf4ae-20a2-4263-a39e-5a90cf3ea2f6 req-d012ed75-9386-4086-b150-64ba05ad2647 service nova] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] No waiting events found dispatching network-vif-plugged-faecacb1-2e14-4d38-8c90-bfb3ff5e23a7 {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:31:58 user nova-compute[70374]: WARNING nova.compute.manager [req-81bcf4ae-20a2-4263-a39e-5a90cf3ea2f6 req-d012ed75-9386-4086-b150-64ba05ad2647 service nova] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Received unexpected event network-vif-plugged-faecacb1-2e14-4d38-8c90-bfb3ff5e23a7 for instance with vm_state building and task_state spawning. 
Jul 27 09:31:58 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70374) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Jul 27 09:31:58 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Found default for hw_cdrom_bus of ide {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:31:58 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Found default for hw_disk_bus of virtio {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:31:58 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Found default for hw_input_bus of None {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:31:58 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Found default for hw_pointer_model of None {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:31:58 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Found default for hw_video_model of virtio {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:31:58 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Found default for hw_vif_model of virtio {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:31:58 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] During sync_power_state the instance has a pending task (spawning). Skip. 
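The "Found default for hw_*" entries above record the libvirt driver registering bus and model defaults (hw_cdrom_bus=ide, hw_disk_bus=virtio, hw_video_model=virtio, hw_vif_model=virtio) because the cirros image did not define them. If you want those choices pinned on the image rather than defaulted at spawn, they can be set as Glance image properties. A hedged openstacksdk sketch: the cloud name is an assumption, and it relies on update_image passing unknown keyword arguments through as image properties:

    import openstack

    # Connect using a clouds.yaml entry (the name "devstack" is an assumption).
    conn = openstack.connect(cloud="devstack")

    # Look up the image used by these instances.
    image = conn.image.find_image("cirros-0.6.2-x86_64-disk")

    # Pin the bus/model choices as image properties so the driver does not
    # have to register defaults for them at spawn time.
    conn.image.update_image(
        image,
        hw_disk_bus="virtio",
        hw_cdrom_bus="ide",
        hw_vif_model="virtio",
        hw_video_model="virtio",
    )
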
Jul 27 09:31:58 user nova-compute[70374]: DEBUG nova.virt.driver [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Emitting event Started> {{(pid=70374) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Jul 27 09:31:58 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] VM Started (Lifecycle Event) Jul 27 09:31:58 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:31:58 user nova-compute[70374]: INFO nova.compute.manager [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Took 5.75 seconds to spawn the instance on the hypervisor. Jul 27 09:31:58 user nova-compute[70374]: DEBUG nova.compute.manager [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:31:58 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70374) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Jul 27 09:31:58 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] During sync_power_state the instance has a pending task (spawning). Skip. Jul 27 09:31:58 user nova-compute[70374]: INFO nova.compute.manager [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] [instance: ea21f129-1e0a-4ad1-b516-c2a8f92835b7] Took 6.54 seconds to build instance. 
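The resource tracker then re-reports inventory to Placement for provider f7548644-4a09-4ad8-9aa6-6e05d85a9f5b (next entry). Placement treats usable capacity per resource class as (total - reserved) * allocation_ratio; a quick arithmetic check against the values logged below, not Nova code:

    # Inventory values as logged for provider f7548644-4a09-4ad8-9aa6-6e05d85a9f5b.
    inventory = {
        "VCPU": {"total": 12, "reserved": 0, "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 16011, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB": {"total": 40, "reserved": 0, "allocation_ratio": 1.0},
    }

    for rc, inv in inventory.items():
        capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(rc, capacity)  # VCPU 48.0, MEMORY_MB 15499.0, DISK_GB 40.0
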
Jul 27 09:31:58 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-14c59844-099e-4337-9779-48a202e18404 tempest-TestStampPattern-795018650 tempest-TestStampPattern-795018650-project-member] Lock "ea21f129-1e0a-4ad1-b516-c2a8f92835b7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.657s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:31:59 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:59 user nova-compute[70374]: DEBUG nova.compute.provider_tree [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Inventory has not changed in ProviderTree for provider: f7548644-4a09-4ad8-9aa6-6e05d85a9f5b {{(pid=70374) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Jul 27 09:31:59 user nova-compute[70374]: DEBUG nova.scheduler.client.report [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Inventory has not changed for provider f7548644-4a09-4ad8-9aa6-6e05d85a9f5b based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16011, 'reserved': 512, 'min_unit': 1, 'max_unit': 16011, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70374) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Jul 27 09:31:59 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.601s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:31:59 user nova-compute[70374]: DEBUG nova.compute.manager [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Start building networks asynchronously for instance. {{(pid=70374) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} Jul 27 09:31:59 user nova-compute[70374]: DEBUG nova.compute.manager [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Allocating IP information in the background. 
{{(pid=70374) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} Jul 27 09:31:59 user nova-compute[70374]: DEBUG nova.network.neutron [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] allocate_for_instance() {{(pid=70374) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} Jul 27 09:31:59 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Jul 27 09:31:59 user nova-compute[70374]: DEBUG nova.compute.manager [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Start building block device mappings for instance. {{(pid=70374) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} Jul 27 09:31:59 user nova-compute[70374]: DEBUG nova.policy [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '544aa4a40ff941c2a52b24faa08a0d29', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '65e935bb36794aa98eb3e94c30e647d9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70374) authorize /opt/stack/nova/nova/policy.py:203}} Jul 27 09:31:59 user nova-compute[70374]: DEBUG nova.compute.manager [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Start spawning the instance on the hypervisor. 
{{(pid=70374) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} Jul 27 09:31:59 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Creating instance directory {{(pid=70374) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Jul 27 09:31:59 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Creating image(s) Jul 27 09:31:59 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Acquiring lock "/opt/stack/data/nova/instances/a72e9d21-b1f4-4e61-8875-c72d7ce88a32/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:31:59 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Lock "/opt/stack/data/nova/instances/a72e9d21-b1f4-4e61-8875-c72d7ce88a32/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:31:59 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Lock "/opt/stack/data/nova/instances/a72e9d21-b1f4-4e61-8875-c72d7ce88a32/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:31:59 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:31:59 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json" returned: 0 in 0.166s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:31:59 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None 
req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Acquiring lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:31:59 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:31:59 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:31:59 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json" returned: 0 in 0.156s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:31:59 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1,backing_fmt=raw /opt/stack/data/nova/instances/a72e9d21-b1f4-4e61-8875-c72d7ce88a32/disk 1073741824 {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:31:59 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1,backing_fmt=raw /opt/stack/data/nova/instances/a72e9d21-b1f4-4e61-8875-c72d7ce88a32/disk 1073741824" returned: 0 in 0.058s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:31:59 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.222s 
{{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:31:59 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:31:59 user nova-compute[70374]: DEBUG nova.network.neutron [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Successfully created port: 75f39855-7414-4b71-a4a8-553502aa8627 {{(pid=70374) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} Jul 27 09:31:59 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:31:59 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json" returned: 0 in 0.180s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:31:59 user nova-compute[70374]: DEBUG nova.virt.disk.api [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Checking if we can resize image /opt/stack/data/nova/instances/a72e9d21-b1f4-4e61-8875-c72d7ce88a32/disk. 
size=1073741824 {{(pid=70374) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Jul 27 09:31:59 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a72e9d21-b1f4-4e61-8875-c72d7ce88a32/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:00 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a72e9d21-b1f4-4e61-8875-c72d7ce88a32/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:00 user nova-compute[70374]: DEBUG nova.virt.disk.api [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Cannot resize image /opt/stack/data/nova/instances/a72e9d21-b1f4-4e61-8875-c72d7ce88a32/disk to a smaller size. {{(pid=70374) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Jul 27 09:32:00 user nova-compute[70374]: DEBUG nova.objects.instance [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Lazy-loading 'migration_context' on Instance uuid a72e9d21-b1f4-4e61-8875-c72d7ce88a32 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:32:00 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Created local disks {{(pid=70374) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4851}} Jul 27 09:32:00 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Ensure instance console log exists: /opt/stack/data/nova/instances/a72e9d21-b1f4-4e61-8875-c72d7ce88a32/console.log {{(pid=70374) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4603}} Jul 27 09:32:00 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:00 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Lock "vgpu_resources" acquired by 
"nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:00 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG nova.network.neutron [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Successfully updated port: 75f39855-7414-4b71-a4a8-553502aa8627 {{(pid=70374) _update_port /opt/stack/nova/nova/network/neutron.py:586}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Acquiring lock "refresh_cache-a72e9d21-b1f4-4e61-8875-c72d7ce88a32" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Acquired lock "refresh_cache-a72e9d21-b1f4-4e61-8875-c72d7ce88a32" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG nova.network.neutron [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Building network info cache for instance {{(pid=70374) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG nova.network.neutron [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Instance cache missing network info. {{(pid=70374) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG nova.compute.manager [req-2fe06278-d4f7-4f30-801c-a2f18a8c0d52 req-4fe8af49-3a6e-49b7-8532-433b5086353b service nova] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Received event network-changed-75f39855-7414-4b71-a4a8-553502aa8627 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG nova.compute.manager [req-2fe06278-d4f7-4f30-801c-a2f18a8c0d52 req-4fe8af49-3a6e-49b7-8532-433b5086353b service nova] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Refreshing instance network info cache due to event network-changed-75f39855-7414-4b71-a4a8-553502aa8627. 
{{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-2fe06278-d4f7-4f30-801c-a2f18a8c0d52 req-4fe8af49-3a6e-49b7-8532-433b5086353b service nova] Acquiring lock "refresh_cache-a72e9d21-b1f4-4e61-8875-c72d7ce88a32" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG nova.network.neutron [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Updating instance_info_cache with network_info: [{"id": "75f39855-7414-4b71-a4a8-553502aa8627", "address": "fa:16:3e:34:5c:8c", "network": {"id": "e1a0003b-0f63-4efa-9d24-933f88eb0373", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-651468270-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "65e935bb36794aa98eb3e94c30e647d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap75f39855-74", "ovs_interfaceid": "75f39855-7414-4b71-a4a8-553502aa8627", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Releasing lock "refresh_cache-a72e9d21-b1f4-4e61-8875-c72d7ce88a32" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG nova.compute.manager [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Instance network_info: |[{"id": "75f39855-7414-4b71-a4a8-553502aa8627", "address": "fa:16:3e:34:5c:8c", "network": {"id": "e1a0003b-0f63-4efa-9d24-933f88eb0373", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-651468270-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "65e935bb36794aa98eb3e94c30e647d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap75f39855-74", "ovs_interfaceid": "75f39855-7414-4b71-a4a8-553502aa8627", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70374) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-2fe06278-d4f7-4f30-801c-a2f18a8c0d52 req-4fe8af49-3a6e-49b7-8532-433b5086353b service nova] Acquired lock "refresh_cache-a72e9d21-b1f4-4e61-8875-c72d7ce88a32" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG nova.network.neutron [req-2fe06278-d4f7-4f30-801c-a2f18a8c0d52 req-4fe8af49-3a6e-49b7-8532-433b5086353b service nova] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Refreshing network info cache for port 75f39855-7414-4b71-a4a8-553502aa8627 {{(pid=70374) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Start _get_guest_xml network_info=[{"id": "75f39855-7414-4b71-a4a8-553502aa8627", "address": "fa:16:3e:34:5c:8c", "network": {"id": "e1a0003b-0f63-4efa-9d24-933f88eb0373", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-651468270-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "65e935bb36794aa98eb3e94c30e647d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap75f39855-74", "ovs_interfaceid": "75f39855-7414-4b71-a4a8-553502aa8627", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-07-27T09:26:27Z,direct_url=,disk_format='qcow2',id=35458adf-261a-4e0b-a4db-b243619b2394,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='37aff788807d4b2aafc10d355583f7c7',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-07-27T09:26:29Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encrypted': False, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'size': 0, 'encryption_options': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'image_id': '35458adf-261a-4e0b-a4db-b243619b2394'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70374) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7553}} Jul 27 09:32:01 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
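[Annotation] The qemu-img probes a few entries above run under oslo_concurrency.prlimit with a 1 GiB address-space and 30 s CPU cap, and the "Cannot resize image ... to a smaller size" message reflects a simple virtual-size comparison. A minimal sketch of that pattern, assuming oslo.concurrency's ProcessLimits/execute interface and not Nova's actual helpers in nova/virt/images.py and nova/virt/disk/api.py, is:

    import json

    from oslo_concurrency import processutils
    from oslo_utils import units

    # Same limits as the prlimit wrapper in the log: 1 GiB address space, 30 s CPU.
    QEMU_IMG_LIMITS = processutils.ProcessLimits(address_space=1 * units.Gi,
                                                 cpu_time=30)

    def qemu_img_info(path):
        # Spawns: python -m oslo_concurrency.prlimit --as=... --cpu=30 --
        #         env LC_ALL=C LANG=C qemu-img info <path> --force-share --output=json
        out, _err = processutils.execute(
            'env', 'LC_ALL=C', 'LANG=C',
            'qemu-img', 'info', path, '--force-share', '--output=json',
            prlimit=QEMU_IMG_LIMITS)
        return json.loads(out)

    def can_resize_image(path, new_size):
        # Mirrors the check logged above: growing is allowed, shrinking is refused.
        return qemu_img_info(path)['virtual-size'] <= new_size

This is a sketch for reading the log, not a drop-in replacement for Nova's code paths.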
Jul 27 09:32:01 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Jul 27 09:32:01 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Searching host: 'user' for CPU controller through CGroups V1... {{(pid=70374) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1650}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] CPU controller missing on host. {{(pid=70374) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1660}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Searching host: 'user' for CPU controller through CGroups V2... {{(pid=70374) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1669}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] CPU controller found on host. {{(pid=70374) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1676}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70374) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-07-27T09:26:16Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1',id=6,is_public=True,memory_mb=512,name='m1.tiny',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-07-27T09:26:27Z,direct_url=,disk_format='qcow2',id=35458adf-261a-4e0b-a4db-b243619b2394,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='37aff788807d4b2aafc10d355583f7c7',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-07-27T09:26:29Z,virtual_size=,visibility=), allow threads: True {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Flavor limits 0:0:0 {{(pid=70374) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:343}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Image limits 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Flavor pref 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Image pref 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70374) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Got 1 possible topologies {{(pid=70374) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG nova.virt.libvirt.vif [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-07-27T09:31:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1038287225',display_name='tempest-ServersNegativeTestJSON-server-1038287225',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(6),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-1038287225',id=13,image_ref='35458adf-261a-4e0b-a4db-b243619b2394',info_cache=InstanceInfoCache,instance_type_id=6,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='65e935bb36794aa98eb3e94c30e647d9',ramdisk_id='',reservation_id='r-ocrv8sa7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='35458adf-261a-4e0b-a4db-b243619b2394',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1865798359',owner_user_name='tempest-ServersNegativeTestJSON-1865798359-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-07-27T09:31:59Z,user_data=None,user_id='544aa4a40ff941c2a52b24faa08a0d29',uuid=a72e9d21-b1f4-4e61-8875-c72d7ce88a32,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "75f39855-7414-4b71-a4a8-553502aa8627", "address": "fa:16:3e:34:5c:8c", "network": {"id": "e1a0003b-0f63-4efa-9d24-933f88eb0373", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-651468270-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "65e935bb36794aa98eb3e94c30e647d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap75f39855-74", "ovs_interfaceid": "75f39855-7414-4b71-a4a8-553502aa8627", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70374) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Converting VIF {"id": "75f39855-7414-4b71-a4a8-553502aa8627", "address": "fa:16:3e:34:5c:8c", "network": 
{"id": "e1a0003b-0f63-4efa-9d24-933f88eb0373", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-651468270-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "65e935bb36794aa98eb3e94c30e647d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap75f39855-74", "ovs_interfaceid": "75f39855-7414-4b71-a4a8-553502aa8627", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:34:5c:8c,bridge_name='br-int',has_traffic_filtering=True,id=75f39855-7414-4b71-a4a8-553502aa8627,network=Network(e1a0003b-0f63-4efa-9d24-933f88eb0373),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75f39855-74') {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG nova.objects.instance [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Lazy-loading 'pci_devices' on Instance uuid a72e9d21-b1f4-4e61-8875-c72d7ce88a32 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] End _get_guest_xml xml= Jul 27 09:32:01 user nova-compute[70374]: a72e9d21-b1f4-4e61-8875-c72d7ce88a32 Jul 27 09:32:01 user nova-compute[70374]: instance-0000000d Jul 27 09:32:01 user nova-compute[70374]: 524288 Jul 27 09:32:01 user nova-compute[70374]: 1 Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: tempest-ServersNegativeTestJSON-server-1038287225 Jul 27 09:32:01 user nova-compute[70374]: 2023-07-27 09:32:01 Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: 512 Jul 27 09:32:01 user nova-compute[70374]: 1 Jul 27 09:32:01 user nova-compute[70374]: 0 Jul 27 09:32:01 user nova-compute[70374]: 0 Jul 27 09:32:01 user nova-compute[70374]: 1 Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: tempest-ServersNegativeTestJSON-1865798359-project-member Jul 27 09:32:01 user nova-compute[70374]: tempest-ServersNegativeTestJSON-1865798359 Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 
user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: OpenStack Foundation Jul 27 09:32:01 user nova-compute[70374]: OpenStack Nova Jul 27 09:32:01 user nova-compute[70374]: 0.0.0 Jul 27 09:32:01 user nova-compute[70374]: a72e9d21-b1f4-4e61-8875-c72d7ce88a32 Jul 27 09:32:01 user nova-compute[70374]: a72e9d21-b1f4-4e61-8875-c72d7ce88a32 Jul 27 09:32:01 user nova-compute[70374]: Virtual Machine Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: hvm Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Nehalem Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: /dev/urandom Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: Jul 27 09:32:01 user nova-compute[70374]: {{(pid=70374) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7559}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG nova.virt.libvirt.vif [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-07-27T09:31:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1038287225',display_name='tempest-ServersNegativeTestJSON-server-1038287225',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(6),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-1038287225',id=13,image_ref='35458adf-261a-4e0b-a4db-b243619b2394',info_cache=InstanceInfoCache,instance_type_id=6,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='65e935bb36794aa98eb3e94c30e647d9',ramdisk_id='',reservation_id='r-ocrv8sa7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='35458adf-261a-4e0b-a4db-b243619b2394',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1865798359',owner_user_name='tempest-ServersNegativeTestJSON-1865798359-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-07-27T09:31:59Z,user_data=None,user_id='544aa4a40ff941c2a52b24faa08a0d29',uuid=a72e9d21-b1f4-4e61-8875-c72d7ce88a32,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "75f39855-7414-4b71-a4a8-553502aa8627", "address": "fa:16:3e:34:5c:8c", "network": {"id": "e1a0003b-0f63-4efa-9d24-933f88eb0373", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-651468270-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "65e935bb36794aa98eb3e94c30e647d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap75f39855-74", "ovs_interfaceid": "75f39855-7414-4b71-a4a8-553502aa8627", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Converting VIF {"id": "75f39855-7414-4b71-a4a8-553502aa8627", "address": "fa:16:3e:34:5c:8c", "network": {"id": 
"e1a0003b-0f63-4efa-9d24-933f88eb0373", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-651468270-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "65e935bb36794aa98eb3e94c30e647d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap75f39855-74", "ovs_interfaceid": "75f39855-7414-4b71-a4a8-553502aa8627", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:34:5c:8c,bridge_name='br-int',has_traffic_filtering=True,id=75f39855-7414-4b71-a4a8-553502aa8627,network=Network(e1a0003b-0f63-4efa-9d24-933f88eb0373),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75f39855-74') {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG os_vif [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:5c:8c,bridge_name='br-int',has_traffic_filtering=True,id=75f39855-7414-4b71-a4a8-553502aa8627,network=Network(e1a0003b-0f63-4efa-9d24-933f88eb0373),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75f39855-74') {{(pid=70374) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap75f39855-74, may_exist=True, interface_attrs={}) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 
command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap75f39855-74, col_values=(('external_ids', {'iface-id': '75f39855-7414-4b71-a4a8-553502aa8627', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:34:5c:8c', 'vm-uuid': 'a72e9d21-b1f4-4e61-8875-c72d7ce88a32'}),), if_exists=True) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:01 user nova-compute[70374]: INFO os_vif [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:5c:8c,bridge_name='br-int',has_traffic_filtering=True,id=75f39855-7414-4b71-a4a8-553502aa8627,network=Network(e1a0003b-0f63-4efa-9d24-933f88eb0373),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75f39855-74') Jul 27 09:32:01 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] No BDM found with device name vda, not building metadata. {{(pid=70374) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Jul 27 09:32:01 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] No VIF found with MAC fa:16:3e:34:5c:8c, not building metadata {{(pid=70374) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Jul 27 09:32:02 user nova-compute[70374]: DEBUG nova.network.neutron [req-2fe06278-d4f7-4f30-801c-a2f18a8c0d52 req-4fe8af49-3a6e-49b7-8532-433b5086353b service nova] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Updated VIF entry in instance network info cache for port 75f39855-7414-4b71-a4a8-553502aa8627. 
{{(pid=70374) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Jul 27 09:32:02 user nova-compute[70374]: DEBUG nova.network.neutron [req-2fe06278-d4f7-4f30-801c-a2f18a8c0d52 req-4fe8af49-3a6e-49b7-8532-433b5086353b service nova] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Updating instance_info_cache with network_info: [{"id": "75f39855-7414-4b71-a4a8-553502aa8627", "address": "fa:16:3e:34:5c:8c", "network": {"id": "e1a0003b-0f63-4efa-9d24-933f88eb0373", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-651468270-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "65e935bb36794aa98eb3e94c30e647d9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap75f39855-74", "ovs_interfaceid": "75f39855-7414-4b71-a4a8-553502aa8627", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:32:02 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-2fe06278-d4f7-4f30-801c-a2f18a8c0d52 req-4fe8af49-3a6e-49b7-8532-433b5086353b service nova] Releasing lock "refresh_cache-a72e9d21-b1f4-4e61-8875-c72d7ce88a32" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Jul 27 09:32:02 user nova-compute[70374]: INFO nova.compute.manager [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Rescuing Jul 27 09:32:02 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Acquiring lock "refresh_cache-6ae93f34-ce7e-4ae4-a5ba-36508361bd54" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Jul 27 09:32:02 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Acquired lock "refresh_cache-6ae93f34-ce7e-4ae4-a5ba-36508361bd54" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Jul 27 09:32:02 user nova-compute[70374]: DEBUG nova.network.neutron [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Building network info cache for instance {{(pid=70374) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} Jul 27 09:32:03 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:03 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) 
__log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:03 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:03 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:32:03 user nova-compute[70374]: DEBUG nova.compute.manager [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Starting heal instance info cache {{(pid=70374) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9842}} Jul 27 09:32:03 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:03 user nova-compute[70374]: DEBUG nova.compute.manager [req-7965d2ce-aad0-4ae9-8205-d52f299cfbb3 req-d743d8be-cc09-42e8-b8d6-7dc86004bcc4 service nova] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Received event network-vif-plugged-75f39855-7414-4b71-a4a8-553502aa8627 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:32:03 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-7965d2ce-aad0-4ae9-8205-d52f299cfbb3 req-d743d8be-cc09-42e8-b8d6-7dc86004bcc4 service nova] Acquiring lock "a72e9d21-b1f4-4e61-8875-c72d7ce88a32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:03 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-7965d2ce-aad0-4ae9-8205-d52f299cfbb3 req-d743d8be-cc09-42e8-b8d6-7dc86004bcc4 service nova] Lock "a72e9d21-b1f4-4e61-8875-c72d7ce88a32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:03 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-7965d2ce-aad0-4ae9-8205-d52f299cfbb3 req-d743d8be-cc09-42e8-b8d6-7dc86004bcc4 service nova] Lock "a72e9d21-b1f4-4e61-8875-c72d7ce88a32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:03 user nova-compute[70374]: DEBUG nova.compute.manager [req-7965d2ce-aad0-4ae9-8205-d52f299cfbb3 req-d743d8be-cc09-42e8-b8d6-7dc86004bcc4 service nova] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] No waiting events found dispatching network-vif-plugged-75f39855-7414-4b71-a4a8-553502aa8627 {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:32:03 user nova-compute[70374]: WARNING nova.compute.manager [req-7965d2ce-aad0-4ae9-8205-d52f299cfbb3 req-d743d8be-cc09-42e8-b8d6-7dc86004bcc4 service nova] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Received unexpected event network-vif-plugged-75f39855-7414-4b71-a4a8-553502aa8627 for instance with vm_state building and task_state spawning. 
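[Annotation] The ovsdbapp transactions above (AddBridgeCommand, AddPortCommand, DbSetCommand on the Interface row) are what os-vif issues while plugging the tap device: ensure br-int exists, add the tap port, and stamp external_ids with the Neutron port id, MAC and instance uuid so OVN can bind it. A rough equivalent using ovsdbapp's documented IDL API might look like the sketch below; the OVSDB endpoint and timeout are assumptions, the port values are taken from the log.

    from ovsdbapp.backend.ovs_idl import connection
    from ovsdbapp.schema.open_vswitch import impl_idl

    # Assumed OVSDB endpoint; DevStack commonly exposes tcp:127.0.0.1:6640 or a
    # local unix socket.
    conn = connection.Connection(
        idl=connection.OvsdbIdl.from_server('tcp:127.0.0.1:6640', 'Open_vSwitch'),
        timeout=10)
    api = impl_idl.OvsdbIdl(conn)

    port_id = '75f39855-7414-4b71-a4a8-553502aa8627'
    with api.transaction(check_error=True) as txn:
        # Same three commands as the "Running txn" entries above.
        txn.add(api.add_br('br-int', may_exist=True, datapath_type='system'))
        txn.add(api.add_port('br-int', 'tap75f39855-74', may_exist=True))
        txn.add(api.db_set(
            'Interface', 'tap75f39855-74',
            ('external_ids', {'iface-id': port_id,
                              'iface-status': 'active',
                              'attached-mac': 'fa:16:3e:34:5c:8c',
                              'vm-uuid': 'a72e9d21-b1f4-4e61-8875-c72d7ce88a32'})))

Once the OVN controller sees the iface-id it binds the logical port, which is what later produces the network-vif-plugged event handled above.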
Jul 27 09:32:03 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Acquiring lock "refresh_cache-fb5ccac9-1e45-4726-b681-cf34cf3fa521" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Jul 27 09:32:03 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Acquired lock "refresh_cache-fb5ccac9-1e45-4726-b681-cf34cf3fa521" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Jul 27 09:32:03 user nova-compute[70374]: DEBUG nova.network.neutron [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Forcefully refreshing network info cache for instance {{(pid=70374) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1996}} Jul 27 09:32:03 user nova-compute[70374]: DEBUG nova.network.neutron [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Updating instance_info_cache with network_info: [{"id": "ce2809b5-cc30-4a29-9770-7df186b16406", "address": "fa:16:3e:ef:0f:b2", "network": {"id": "de376ae8-a0af-4eda-95e8-dfb1ca0f2283", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1261629972-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8acda82ef3f428fbb93847922a213d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapce2809b5-cc", "ovs_interfaceid": "ce2809b5-cc30-4a29-9770-7df186b16406", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:32:03 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Releasing lock "refresh_cache-6ae93f34-ce7e-4ae4-a5ba-36508361bd54" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Jul 27 09:32:03 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:03 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG nova.network.neutron [None 
req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Updating instance_info_cache with network_info: [{"id": "fd47b104-c8ff-4b76-8f3a-53725f9f318c", "address": "fa:16:3e:57:cb:7b", "network": {"id": "de376ae8-a0af-4eda-95e8-dfb1ca0f2283", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1261629972-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8acda82ef3f428fbb93847922a213d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd47b104-c8", "ovs_interfaceid": "fd47b104-c8ff-4b76-8f3a-53725f9f318c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Releasing lock "refresh_cache-fb5ccac9-1e45-4726-b681-cf34cf3fa521" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG nova.compute.manager [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] [instance: fb5ccac9-1e45-4726-b681-cf34cf3fa521] Updated the network info_cache for instance {{(pid=70374) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager.update_available_resource {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Auditing locally available compute resources for user (node: user) {{(pid=70374) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} Jul 27 09:32:04 user nova-compute[70374]: INFO nova.virt.libvirt.driver [-] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Instance destroyed successfully. Jul 27 09:32:04 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Attempting rescue Jul 27 09:32:04 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} {{(pid=70374) rescue /opt/stack/nova/nova/virt/libvirt/driver.py:4308}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Instance directory exists: not creating {{(pid=70374) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4713}} Jul 27 09:32:04 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Creating image(s) Jul 27 09:32:04 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Acquiring lock "/opt/stack/data/nova/instances/6ae93f34-ce7e-4ae4-a5ba-36508361bd54/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Lock "/opt/stack/data/nova/instances/6ae93f34-ce7e-4ae4-a5ba-36508361bd54/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Lock 
"/opt/stack/data/nova/instances/6ae93f34-ce7e-4ae4-a5ba-36508361bd54/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG nova.objects.instance [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Lazy-loading 'trusted_certs' on Instance uuid 6ae93f34-ce7e-4ae4-a5ba-36508361bd54 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Acquiring lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json" returned: 0 in 0.141s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1,backing_fmt=raw /opt/stack/data/nova/instances/6ae93f34-ce7e-4ae4-a5ba-36508361bd54/disk.rescue {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 
tempest-ServerRescueNegativeTestJSON-241559119-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1,backing_fmt=raw /opt/stack/data/nova/instances/6ae93f34-ce7e-4ae4-a5ba-36508361bd54/disk.rescue" returned: 0 in 0.060s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.207s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG nova.objects.instance [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Lazy-loading 'migration_context' on Instance uuid 6ae93f34-ce7e-4ae4-a5ba-36508361bd54 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Created local disks {{(pid=70374) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4851}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Start _get_guest_xml network_info=[{"id": "ce2809b5-cc30-4a29-9770-7df186b16406", "address": "fa:16:3e:ef:0f:b2", "network": {"id": "de376ae8-a0af-4eda-95e8-dfb1ca0f2283", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1261629972-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-1261629972-network", "vif_mac": "fa:16:3e:ef:0f:b2"}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8acda82ef3f428fbb93847922a213d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapce2809b5-cc", "ovs_interfaceid": "ce2809b5-cc30-4a29-9770-7df186b16406", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-07-27T09:26:27Z,direct_url=,disk_format='qcow2',id=35458adf-261a-4e0b-a4db-b243619b2394,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='37aff788807d4b2aafc10d355583f7c7',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-07-27T09:26:29Z,virtual_size=,visibility=) rescue={'image_id': '35458adf-261a-4e0b-a4db-b243619b2394', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None {{(pid=70374) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7553}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG nova.objects.instance [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Lazy-loading 'resources' on Instance uuid 6ae93f34-ce7e-4ae4-a5ba-36508361bd54 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG nova.objects.instance [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Lazy-loading 'numa_topology' on Instance uuid 6ae93f34-ce7e-4ae4-a5ba-36508361bd54 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/773917c6-56d7-4491-a760-05f51593b7f0/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:04 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Jul 27 09:32:04 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Jul 27 09:32:04 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Searching host: 'user' for CPU controller through CGroups V1... {{(pid=70374) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1650}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] CPU controller missing on host. 
{{(pid=70374) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1660}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Searching host: 'user' for CPU controller through CGroups V2... {{(pid=70374) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1669}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] CPU controller found on host. {{(pid=70374) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1676}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70374) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-07-27T09:26:16Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1',id=6,is_public=True,memory_mb=512,name='m1.tiny',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-07-27T09:26:27Z,direct_url=,disk_format='qcow2',id=35458adf-261a-4e0b-a4db-b243619b2394,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='37aff788807d4b2aafc10d355583f7c7',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-07-27T09:26:29Z,virtual_size=,visibility=), allow threads: True {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Flavor limits 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Image limits 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Flavor pref 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Image pref 0:0:0 
{{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70374) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Got 1 possible topologies {{(pid=70374) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG nova.objects.instance [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Lazy-loading 'vcpu_model' on Instance uuid 6ae93f34-ce7e-4ae4-a5ba-36508361bd54 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG nova.virt.libvirt.vif [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-07-27T09:30:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1457979026',display_name='tempest-ServerRescueNegativeTestJSON-server-1457979026',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(6),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-1457979026',id=4,image_ref='35458adf-261a-4e0b-a4db-b243619b2394',info_cache=InstanceInfoCache,instance_type_id=6,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-07-27T09:30:30Z,launched_on='user',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='d8acda82ef3f428fbb93847922a213d1',ramdisk_id='',reservation_id='r-x508h1z9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='35458adf-261a-4e0b-a4db-b243619b2394',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerRescueNegativeTestJSON-241559119',owner_user_name='tempest-ServerRescueNegativeTestJSON-241559119-project-member'},tags=,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2023-07-27T09:30:30Z,user_data=None,user_id='d059c8dff3644f3c9c0c54498a4d78f7',uuid=6ae93f34-ce7e-4ae4-a5ba-36508361bd54,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ce2809b5-cc30-4a29-9770-7df186b16406", "address": "fa:16:3e:ef:0f:b2", "network": {"id": "de376ae8-a0af-4eda-95e8-dfb1ca0f2283", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1261629972-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-1261629972-network", "vif_mac": "fa:16:3e:ef:0f:b2"}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8acda82ef3f428fbb93847922a213d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapce2809b5-cc", "ovs_interfaceid": "ce2809b5-cc30-4a29-9770-7df186b16406", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70374) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None 
req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Converting VIF {"id": "ce2809b5-cc30-4a29-9770-7df186b16406", "address": "fa:16:3e:ef:0f:b2", "network": {"id": "de376ae8-a0af-4eda-95e8-dfb1ca0f2283", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1261629972-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-1261629972-network", "vif_mac": "fa:16:3e:ef:0f:b2"}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8acda82ef3f428fbb93847922a213d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapce2809b5-cc", "ovs_interfaceid": "ce2809b5-cc30-4a29-9770-7df186b16406", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ef:0f:b2,bridge_name='br-int',has_traffic_filtering=True,id=ce2809b5-cc30-4a29-9770-7df186b16406,network=Network(de376ae8-a0af-4eda-95e8-dfb1ca0f2283),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce2809b5-cc') {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG nova.objects.instance [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] Lazy-loading 'pci_devices' on Instance uuid 6ae93f34-ce7e-4ae4-a5ba-36508361bd54 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] End _get_guest_xml xml= [guest XML elided: the libvirt domain definition was logged here across the following multi-line record, but its XML tags were stripped during log extraction; the surviving text values include 6ae93f34-ce7e-4ae4-a5ba-36508361bd54, instance-00000004, 524288, 1, tempest-ServerRescueNegativeTestJSON-server-1457979026, 2023-07-27 09:32:04, 512, 1, 0, 0, 1, the tempest-ServerRescueNegativeTestJSON-241559119 project and project-member user names, OpenStack Foundation, OpenStack Nova, 0.0.0, Virtual Machine, hvm, Nehalem, and /dev/urandom] {{(pid=70374) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7559}} Jul 27 09:32:04 user nova-compute[70374]: INFO nova.virt.libvirt.driver [-] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Instance destroyed successfully.
Jul 27 09:32:04 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/773917c6-56d7-4491-a760-05f51593b7f0/disk --force-share --output=json" returned: 0 in 0.167s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/773917c6-56d7-4491-a760-05f51593b7f0/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] No BDM found with device name vda, not building metadata. {{(pid=70374) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] No BDM found with device name vdb, not building metadata. {{(pid=70374) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] No VIF found with MAC fa:16:3e:ef:0f:b2, not building metadata {{(pid=70374) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Jul 27 09:32:04 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/773917c6-56d7-4491-a760-05f51593b7f0/disk --force-share --output=json" returned: 0 in 0.147s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:05 user nova-compute[70374]: DEBUG nova.virt.driver [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Emitting event Resumed> {{(pid=70374) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Jul 27 09:32:05 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] VM Resumed (Lifecycle Event) Jul 27 09:32:05 user nova-compute[70374]: DEBUG nova.compute.manager [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Instance event wait completed in 0 seconds for {{(pid=70374) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Jul 27 09:32:05 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 
tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Guest created on hypervisor {{(pid=70374) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4411}} Jul 27 09:32:05 user nova-compute[70374]: INFO nova.virt.libvirt.driver [-] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Instance spawned successfully. Jul 27 09:32:05 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:894}} Jul 27 09:32:05 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:32:05 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70374) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Jul 27 09:32:05 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Found default for hw_cdrom_bus of ide {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:32:05 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Found default for hw_disk_bus of virtio {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:32:05 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Found default for hw_input_bus of None {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:32:05 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Found default for hw_pointer_model of None {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:32:05 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Found default for 
hw_video_model of virtio {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:32:05 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Found default for hw_vif_model of virtio {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:32:05 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] During sync_power_state the instance has a pending task (spawning). Skip. Jul 27 09:32:05 user nova-compute[70374]: DEBUG nova.virt.driver [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Emitting event Started> {{(pid=70374) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Jul 27 09:32:05 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] VM Started (Lifecycle Event) Jul 27 09:32:05 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:32:05 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70374) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Jul 27 09:32:05 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] During sync_power_state the instance has a pending task (spawning). Skip. Jul 27 09:32:05 user nova-compute[70374]: INFO nova.compute.manager [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Took 6.09 seconds to spawn the instance on the hypervisor. 
Jul 27 09:32:05 user nova-compute[70374]: DEBUG nova.compute.manager [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:32:05 user nova-compute[70374]: DEBUG nova.compute.manager [req-bd4958a1-3a49-40f8-8cf4-aa8110977b72 req-1df54f40-ab3e-4205-81a9-7daf9965824e service nova] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Received event network-vif-plugged-75f39855-7414-4b71-a4a8-553502aa8627 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:32:05 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-bd4958a1-3a49-40f8-8cf4-aa8110977b72 req-1df54f40-ab3e-4205-81a9-7daf9965824e service nova] Acquiring lock "a72e9d21-b1f4-4e61-8875-c72d7ce88a32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:05 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-bd4958a1-3a49-40f8-8cf4-aa8110977b72 req-1df54f40-ab3e-4205-81a9-7daf9965824e service nova] Lock "a72e9d21-b1f4-4e61-8875-c72d7ce88a32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:05 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-bd4958a1-3a49-40f8-8cf4-aa8110977b72 req-1df54f40-ab3e-4205-81a9-7daf9965824e service nova] Lock "a72e9d21-b1f4-4e61-8875-c72d7ce88a32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:05 user nova-compute[70374]: DEBUG nova.compute.manager [req-bd4958a1-3a49-40f8-8cf4-aa8110977b72 req-1df54f40-ab3e-4205-81a9-7daf9965824e service nova] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] No waiting events found dispatching network-vif-plugged-75f39855-7414-4b71-a4a8-553502aa8627 {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:32:05 user nova-compute[70374]: WARNING nova.compute.manager [req-bd4958a1-3a49-40f8-8cf4-aa8110977b72 req-1df54f40-ab3e-4205-81a9-7daf9965824e service nova] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Received unexpected event network-vif-plugged-75f39855-7414-4b71-a4a8-553502aa8627 for instance with vm_state building and task_state spawning. 
Jul 27 09:32:05 user nova-compute[70374]: DEBUG nova.compute.manager [req-bd4958a1-3a49-40f8-8cf4-aa8110977b72 req-1df54f40-ab3e-4205-81a9-7daf9965824e service nova] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Received event network-vif-unplugged-ce2809b5-cc30-4a29-9770-7df186b16406 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:32:05 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-bd4958a1-3a49-40f8-8cf4-aa8110977b72 req-1df54f40-ab3e-4205-81a9-7daf9965824e service nova] Acquiring lock "6ae93f34-ce7e-4ae4-a5ba-36508361bd54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:05 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-bd4958a1-3a49-40f8-8cf4-aa8110977b72 req-1df54f40-ab3e-4205-81a9-7daf9965824e service nova] Lock "6ae93f34-ce7e-4ae4-a5ba-36508361bd54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:05 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-bd4958a1-3a49-40f8-8cf4-aa8110977b72 req-1df54f40-ab3e-4205-81a9-7daf9965824e service nova] Lock "6ae93f34-ce7e-4ae4-a5ba-36508361bd54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:05 user nova-compute[70374]: DEBUG nova.compute.manager [req-bd4958a1-3a49-40f8-8cf4-aa8110977b72 req-1df54f40-ab3e-4205-81a9-7daf9965824e service nova] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] No waiting events found dispatching network-vif-unplugged-ce2809b5-cc30-4a29-9770-7df186b16406 {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:32:05 user nova-compute[70374]: WARNING nova.compute.manager [req-bd4958a1-3a49-40f8-8cf4-aa8110977b72 req-1df54f40-ab3e-4205-81a9-7daf9965824e service nova] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Received unexpected event network-vif-unplugged-ce2809b5-cc30-4a29-9770-7df186b16406 for instance with vm_state active and task_state rescuing. 
Jul 27 09:32:05 user nova-compute[70374]: DEBUG nova.compute.manager [req-bd4958a1-3a49-40f8-8cf4-aa8110977b72 req-1df54f40-ab3e-4205-81a9-7daf9965824e service nova] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Received event network-vif-plugged-ce2809b5-cc30-4a29-9770-7df186b16406 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:32:05 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-bd4958a1-3a49-40f8-8cf4-aa8110977b72 req-1df54f40-ab3e-4205-81a9-7daf9965824e service nova] Acquiring lock "6ae93f34-ce7e-4ae4-a5ba-36508361bd54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:05 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-bd4958a1-3a49-40f8-8cf4-aa8110977b72 req-1df54f40-ab3e-4205-81a9-7daf9965824e service nova] Lock "6ae93f34-ce7e-4ae4-a5ba-36508361bd54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:05 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-bd4958a1-3a49-40f8-8cf4-aa8110977b72 req-1df54f40-ab3e-4205-81a9-7daf9965824e service nova] Lock "6ae93f34-ce7e-4ae4-a5ba-36508361bd54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:05 user nova-compute[70374]: DEBUG nova.compute.manager [req-bd4958a1-3a49-40f8-8cf4-aa8110977b72 req-1df54f40-ab3e-4205-81a9-7daf9965824e service nova] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] No waiting events found dispatching network-vif-plugged-ce2809b5-cc30-4a29-9770-7df186b16406 {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:32:05 user nova-compute[70374]: WARNING nova.compute.manager [req-bd4958a1-3a49-40f8-8cf4-aa8110977b72 req-1df54f40-ab3e-4205-81a9-7daf9965824e service nova] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Received unexpected event network-vif-plugged-ce2809b5-cc30-4a29-9770-7df186b16406 for instance with vm_state active and task_state rescuing. Jul 27 09:32:05 user nova-compute[70374]: INFO nova.compute.manager [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] [instance: a72e9d21-b1f4-4e61-8875-c72d7ce88a32] Took 7.11 seconds to build instance. 
Jul 27 09:32:05 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-a6149f09-bd40-40f2-91e2-9582f3b42434 tempest-ServersNegativeTestJSON-1865798359 tempest-ServersNegativeTestJSON-1865798359-project-member] Lock "a72e9d21-b1f4-4e61-8875-c72d7ce88a32" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.224s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:05 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Acquiring lock "6a5593cd-3ab6-4859-b0fb-33a0ed702dc8" by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:05 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Lock "6a5593cd-3ab6-4859-b0fb-33a0ed702dc8" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:05 user nova-compute[70374]: DEBUG nova.objects.instance [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Lazy-loading 'flavor' on Instance uuid 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:32:05 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Lock "6a5593cd-3ab6-4859-b0fb-33a0ed702dc8" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" :: held 0.090s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:06 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Acquiring lock "6a5593cd-3ab6-4859-b0fb-33a0ed702dc8" by "nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:06 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Lock "6a5593cd-3ab6-4859-b0fb-33a0ed702dc8" acquired by "nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:06 user nova-compute[70374]: INFO nova.compute.manager [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Attaching volume 4aac502f-2b4f-4e7f-83c0-30e91f701ca5 to /dev/vdb Jul 27 09:32:06 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 
{{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:06 user nova-compute[70374]: DEBUG os_brick.utils [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '10.0.0.210', 'multipath': False, 'enforce_multipath': True, 'host': 'user', 'execute': None}" {{(pid=70374) trace_logging_wrapper /usr/local/lib/python3.10/dist-packages/os_brick/utils.py:175}} Jul 27 09:32:06 user nova-compute[70374]: INFO oslo.privsep.daemon [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova-cpu.conf', '--privsep_context', 'os_brick.privileged.default', '--privsep_sock_path', '/tmp/tmplxfwilyz/privsep.sock'] Jul 27 09:32:06 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:06 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:06 user sudo[82117]: stack : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/nova-rootwrap /etc/nova/rootwrap.conf privsep-helper --config-file /etc/nova/nova-cpu.conf --privsep_context os_brick.privileged.default --privsep_sock_path /tmp/tmplxfwilyz/privsep.sock Jul 27 09:32:06 user sudo[82117]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1001) Jul 27 09:32:06 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:06 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:06 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6ae93f34-ce7e-4ae4-a5ba-36508361bd54/disk.rescue --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:06 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:06 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6ae93f34-ce7e-4ae4-a5ba-36508361bd54/disk.rescue --force-share --output=json" returned: 0 in 0.179s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:06 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit 
--as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6ae93f34-ce7e-4ae4-a5ba-36508361bd54/disk.rescue --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:07 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6ae93f34-ce7e-4ae4-a5ba-36508361bd54/disk.rescue --force-share --output=json" returned: 0 in 0.148s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:07 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6ae93f34-ce7e-4ae4-a5ba-36508361bd54/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:07 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6ae93f34-ce7e-4ae4-a5ba-36508361bd54/disk --force-share --output=json" returned: 0 in 0.161s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:07 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6ae93f34-ce7e-4ae4-a5ba-36508361bd54/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:07 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6ae93f34-ce7e-4ae4-a5ba-36508361bd54/disk --force-share --output=json" returned: 0 in 0.192s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:07 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d35fe056-8279-479a-a673-6c61e5ec6933/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:07 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d35fe056-8279-479a-a673-6c61e5ec6933/disk --force-share --output=json" returned: 0 in 0.156s {{(pid=70374) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:07 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d35fe056-8279-479a-a673-6c61e5ec6933/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:07 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d35fe056-8279-479a-a673-6c61e5ec6933/disk --force-share --output=json" returned: 0 in 0.160s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:07 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ea21f129-1e0a-4ad1-b516-c2a8f92835b7/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:07 user nova-compute[70374]: DEBUG nova.compute.manager [req-6c84a190-c75d-4d6f-9a75-97ab203f8c8f req-4987108d-732b-476d-aea5-ab0367301c71 service nova] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Received event network-changed-e6dc311e-c8d0-4b0c-bf42-96919f9d6f02 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:32:07 user nova-compute[70374]: DEBUG nova.compute.manager [req-6c84a190-c75d-4d6f-9a75-97ab203f8c8f req-4987108d-732b-476d-aea5-ab0367301c71 service nova] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Refreshing instance network info cache due to event network-changed-e6dc311e-c8d0-4b0c-bf42-96919f9d6f02. 
{{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Jul 27 09:32:07 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-6c84a190-c75d-4d6f-9a75-97ab203f8c8f req-4987108d-732b-476d-aea5-ab0367301c71 service nova] Acquiring lock "refresh_cache-8640c525-e6ba-4bf8-9fe0-2c08155dd1cb" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Jul 27 09:32:07 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-6c84a190-c75d-4d6f-9a75-97ab203f8c8f req-4987108d-732b-476d-aea5-ab0367301c71 service nova] Acquired lock "refresh_cache-8640c525-e6ba-4bf8-9fe0-2c08155dd1cb" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Jul 27 09:32:07 user nova-compute[70374]: DEBUG nova.network.neutron [req-6c84a190-c75d-4d6f-9a75-97ab203f8c8f req-4987108d-732b-476d-aea5-ab0367301c71 service nova] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Refreshing network info cache for port e6dc311e-c8d0-4b0c-bf42-96919f9d6f02 {{(pid=70374) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Jul 27 09:32:07 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ea21f129-1e0a-4ad1-b516-c2a8f92835b7/disk --force-share --output=json" returned: 0 in 0.229s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:07 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ea21f129-1e0a-4ad1-b516-c2a8f92835b7/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG nova.compute.manager [req-fcf0ac23-63e7-41d3-842b-8bd39c33a046 req-27de16ff-f12d-49f7-a485-c7058018ff61 service nova] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Received event network-vif-plugged-ce2809b5-cc30-4a29-9770-7df186b16406 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-fcf0ac23-63e7-41d3-842b-8bd39c33a046 req-27de16ff-f12d-49f7-a485-c7058018ff61 service nova] Acquiring lock "6ae93f34-ce7e-4ae4-a5ba-36508361bd54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-fcf0ac23-63e7-41d3-842b-8bd39c33a046 req-27de16ff-f12d-49f7-a485-c7058018ff61 service nova] Lock "6ae93f34-ce7e-4ae4-a5ba-36508361bd54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-fcf0ac23-63e7-41d3-842b-8bd39c33a046 req-27de16ff-f12d-49f7-a485-c7058018ff61 service nova] Lock "6ae93f34-ce7e-4ae4-a5ba-36508361bd54-events" "released" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG nova.compute.manager [req-fcf0ac23-63e7-41d3-842b-8bd39c33a046 req-27de16ff-f12d-49f7-a485-c7058018ff61 service nova] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] No waiting events found dispatching network-vif-plugged-ce2809b5-cc30-4a29-9770-7df186b16406 {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:32:08 user nova-compute[70374]: WARNING nova.compute.manager [req-fcf0ac23-63e7-41d3-842b-8bd39c33a046 req-27de16ff-f12d-49f7-a485-c7058018ff61 service nova] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Received unexpected event network-vif-plugged-ce2809b5-cc30-4a29-9770-7df186b16406 for instance with vm_state active and task_state rescuing. Jul 27 09:32:08 user nova-compute[70374]: DEBUG nova.compute.manager [req-fcf0ac23-63e7-41d3-842b-8bd39c33a046 req-27de16ff-f12d-49f7-a485-c7058018ff61 service nova] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Received event network-vif-plugged-ce2809b5-cc30-4a29-9770-7df186b16406 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-fcf0ac23-63e7-41d3-842b-8bd39c33a046 req-27de16ff-f12d-49f7-a485-c7058018ff61 service nova] Acquiring lock "6ae93f34-ce7e-4ae4-a5ba-36508361bd54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-fcf0ac23-63e7-41d3-842b-8bd39c33a046 req-27de16ff-f12d-49f7-a485-c7058018ff61 service nova] Lock "6ae93f34-ce7e-4ae4-a5ba-36508361bd54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-fcf0ac23-63e7-41d3-842b-8bd39c33a046 req-27de16ff-f12d-49f7-a485-c7058018ff61 service nova] Lock "6ae93f34-ce7e-4ae4-a5ba-36508361bd54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG nova.compute.manager [req-fcf0ac23-63e7-41d3-842b-8bd39c33a046 req-27de16ff-f12d-49f7-a485-c7058018ff61 service nova] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] No waiting events found dispatching network-vif-plugged-ce2809b5-cc30-4a29-9770-7df186b16406 {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:32:08 user nova-compute[70374]: WARNING nova.compute.manager [req-fcf0ac23-63e7-41d3-842b-8bd39c33a046 req-27de16ff-f12d-49f7-a485-c7058018ff61 service nova] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Received unexpected event network-vif-plugged-ce2809b5-cc30-4a29-9770-7df186b16406 for instance with vm_state active and task_state rescuing. 
Jul 27 09:32:08 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ea21f129-1e0a-4ad1-b516-c2a8f92835b7/disk --force-share --output=json" returned: 0 in 0.161s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:08 user sudo[82117]: pam_unix(sudo:session): session closed for user root Jul 27 09:32:08 user nova-compute[70374]: INFO oslo.privsep.daemon [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Spawned new privsep daemon via rootwrap Jul 27 09:32:08 user nova-compute[70374]: INFO oslo.privsep.daemon [-] privsep daemon starting Jul 27 09:32:08 user nova-compute[70374]: INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0 Jul 27 09:32:08 user nova-compute[70374]: INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none Jul 27 09:32:08 user nova-compute[70374]: INFO oslo.privsep.daemon [-] privsep daemon running as pid 82164 Jul 27 09:32:08 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[75c828de-274f-42f4-bed1-251cf69127ae]: (2,) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/04a990b9-ed32-41e9-b384-f0886e8d1b49/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/04a990b9-ed32-41e9-b384-f0886e8d1b49/disk --force-share --output=json" returned: 0 in 0.157s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/04a990b9-ed32-41e9-b384-f0886e8d1b49/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.016s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[755a30e7-455b-44bd-bca9-079c4da80a4b]: 
(4, ('InitiatorName=iqn.2016-04.com.open-iscsi:ce3dd372cc44\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:08 user nova-compute[70374]: WARNING os_brick.initiator.connectors.nvmeof [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Could not find nvme_core/parameters/multipath: FileNotFoundError: [Errno 2] No such file or directory: '/sys/module/nvme_core/parameters/multipath' Jul 27 09:32:08 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt / -n -o SOURCE {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "findmnt / -n -o SOURCE" returned: 0 in 0.022s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[766a28bf-bff9-4a49-a61c-81611a75111b]: (4, ('/dev/mapper/vg0-lv--0\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): blkid /dev/mapper/vg0-lv--0 -s UUID -o value {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "blkid /dev/mapper/vg0-lv--0 -s UUID -o value" returned: 0 in 0.019s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[e7a56d33-2b2f-434e-a579-d46e9c30fd01]: (4, ('e95b3b51-542d-42ca-ac40-f83360608668\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[a9c08481-ba51-4714-a540-2c4feada2898]: (4, 'e20c3142-5af9-7467-ecd8-70b2e4a210d6') {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Running cmd (subprocess): nvme version {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] 'nvme version' failed. Not Retrying. 
{{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:473}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.nvmeof [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] nvme not present on system {{(pid=70374) nvme_present /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/nvmeof.py:757}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): nvme show-hostnqn {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/04a990b9-ed32-41e9-b384-f0886e8d1b49/disk --force-share --output=json" returned: 0 in 0.210s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] 'nvme show-hostnqn' failed. Not Retrying. {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:473}} Jul 27 09:32:08 user nova-compute[70374]: WARNING os_brick.privileged.nvmeof [-] Could not generate host nqn: [Errno 2] No such file or directory: 'nvme' Jul 27 09:32:08 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[394179ce-7af2-4529-a03d-09abbb56d5f2]: (4, '') {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.lightos [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] LIGHTOS: [Errno 111] ECONNREFUSED {{(pid=70374) find_dsc /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/lightos.py:98}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.lightos [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] LIGHTOS: did not find dsc, continuing anyway. {{(pid=70374) get_connector_properties /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/lightos.py:76}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.lightos [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] LIGHTOS: no hostnqn found. 
{{(pid=70374) get_connector_properties /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/lightos.py:84}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG os_brick.utils [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] <== get_connector_properties: return (2297ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '10.0.0.210', 'host': 'user', 'multipath': False, 'initiator': 'iqn.2016-04.com.open-iscsi:ce3dd372cc44', 'do_local_attach': False, 'uuid': 'e95b3b51-542d-42ca-ac40-f83360608668', 'system uuid': 'e20c3142-5af9-7467-ecd8-70b2e4a210d6', 'nvme_native_multipath': False} {{(pid=70374) trace_logging_wrapper /usr/local/lib/python3.10/dist-packages/os_brick/utils.py:202}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG nova.virt.block_device [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Updating existing volume attachment record: cd4f606e-45ca-4312-a29e-c9e2cffc972d {{(pid=70374) _volume_attach /opt/stack/nova/nova/virt/block_device.py:631}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fb5ccac9-1e45-4726-b681-cf34cf3fa521/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fb5ccac9-1e45-4726-b681-cf34cf3fa521/disk --force-share --output=json" returned: 0 in 0.183s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fb5ccac9-1e45-4726-b681-cf34cf3fa521/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG nova.network.neutron [req-6c84a190-c75d-4d6f-9a75-97ab203f8c8f req-4987108d-732b-476d-aea5-ab0367301c71 service nova] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Updated VIF entry in instance network info cache for port e6dc311e-c8d0-4b0c-bf42-96919f9d6f02. 
{{(pid=70374) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG nova.network.neutron [req-6c84a190-c75d-4d6f-9a75-97ab203f8c8f req-4987108d-732b-476d-aea5-ab0367301c71 service nova] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Updating instance_info_cache with network_info: [{"id": "e6dc311e-c8d0-4b0c-bf42-96919f9d6f02", "address": "fa:16:3e:21:44:72", "network": {"id": "fb617df7-46b2-48ba-beb1-29eefa43aa47", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-96492016-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "bfaa69b3745a435795aa636ccddec3af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape6dc311e-c8", "ovs_interfaceid": "e6dc311e-c8d0-4b0c-bf42-96919f9d6f02", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-6c84a190-c75d-4d6f-9a75-97ab203f8c8f req-4987108d-732b-476d-aea5-ab0367301c71 service nova] Releasing lock "refresh_cache-8640c525-e6ba-4bf8-9fe0-2c08155dd1cb" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Removed pending event for 6ae93f34-ce7e-4ae4-a5ba-36508361bd54 due to event {{(pid=70374) _event_emit_delayed /opt/stack/nova/nova/virt/libvirt/host.py:438}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG nova.virt.driver [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Emitting event Resumed> {{(pid=70374) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Jul 27 09:32:08 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] VM Resumed (Lifecycle Event) Jul 27 09:32:08 user nova-compute[70374]: DEBUG nova.compute.manager [None req-0f5e32e3-00b9-4455-b913-8f4c7e73be2e tempest-ServerRescueNegativeTestJSON-241559119 tempest-ServerRescueNegativeTestJSON-241559119-project-member] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fb5ccac9-1e45-4726-b681-cf34cf3fa521/disk --force-share --output=json" returned: 0 in 0.168s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd 
(subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a72e9d21-b1f4-4e61-8875-c72d7ce88a32/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:32:08 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 {{(pid=70374) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Jul 27 09:32:09 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] During sync_power_state the instance has a pending task (rescuing). Skip. Jul 27 09:32:09 user nova-compute[70374]: DEBUG nova.virt.driver [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Emitting event Started> {{(pid=70374) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Jul 27 09:32:09 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] VM Started (Lifecycle Event) Jul 27 09:32:09 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:32:09 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:09 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 6ae93f34-ce7e-4ae4-a5ba-36508361bd54] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 {{(pid=70374) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Jul 27 09:32:09 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a72e9d21-b1f4-4e61-8875-c72d7ce88a32/disk --force-share --output=json" returned: 0 in 0.164s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:09 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a72e9d21-b1f4-4e61-8875-c72d7ce88a32/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:09 user 
nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a72e9d21-b1f4-4e61-8875-c72d7ce88a32/disk --force-share --output=json" returned: 0 in 0.168s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:09 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a5593cd-3ab6-4859-b0fb-33a0ed702dc8/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:09 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a5593cd-3ab6-4859-b0fb-33a0ed702dc8/disk --force-share --output=json" returned: 0 in 0.181s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:09 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a5593cd-3ab6-4859-b0fb-33a0ed702dc8/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:09 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6a5593cd-3ab6-4859-b0fb-33a0ed702dc8/disk --force-share --output=json" returned: 0 in 0.192s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:09 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/42f4c546-47e4-485b-be29-4081c7557bad/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:09 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/42f4c546-47e4-485b-be29-4081c7557bad/disk --force-share --output=json" returned: 0 in 0.155s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:09 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env 
LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/42f4c546-47e4-485b-be29-4081c7557bad/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:09 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/42f4c546-47e4-485b-be29-4081c7557bad/disk --force-share --output=json" returned: 0 in 0.148s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:10 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/25214e8a-c626-46a7-b273-eb491c2fc91b/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:10 user nova-compute[70374]: DEBUG nova.compute.manager [req-eba497ad-998a-4e29-9387-6a7ae29481f4 req-95f442ba-3c88-4846-91c1-81932966edad service nova] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Received event network-changed-b825d9b4-15c4-4b47-a3f7-af9838d09458 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:32:10 user nova-compute[70374]: DEBUG nova.compute.manager [req-eba497ad-998a-4e29-9387-6a7ae29481f4 req-95f442ba-3c88-4846-91c1-81932966edad service nova] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Refreshing instance network info cache due to event network-changed-b825d9b4-15c4-4b47-a3f7-af9838d09458. 
{{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Jul 27 09:32:10 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-eba497ad-998a-4e29-9387-6a7ae29481f4 req-95f442ba-3c88-4846-91c1-81932966edad service nova] Acquiring lock "refresh_cache-25214e8a-c626-46a7-b273-eb491c2fc91b" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Jul 27 09:32:10 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-eba497ad-998a-4e29-9387-6a7ae29481f4 req-95f442ba-3c88-4846-91c1-81932966edad service nova] Acquired lock "refresh_cache-25214e8a-c626-46a7-b273-eb491c2fc91b" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Jul 27 09:32:10 user nova-compute[70374]: DEBUG nova.network.neutron [req-eba497ad-998a-4e29-9387-6a7ae29481f4 req-95f442ba-3c88-4846-91c1-81932966edad service nova] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Refreshing network info cache for port b825d9b4-15c4-4b47-a3f7-af9838d09458 {{(pid=70374) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Jul 27 09:32:10 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/25214e8a-c626-46a7-b273-eb491c2fc91b/disk --force-share --output=json" returned: 0 in 0.180s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:10 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/25214e8a-c626-46a7-b273-eb491c2fc91b/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:10 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/25214e8a-c626-46a7-b273-eb491c2fc91b/disk --force-share --output=json" returned: 0 in 0.148s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:10 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8640c525-e6ba-4bf8-9fe0-2c08155dd1cb/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:10 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8640c525-e6ba-4bf8-9fe0-2c08155dd1cb/disk --force-share --output=json" returned: 0 in 0.180s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:10 user nova-compute[70374]: DEBUG 
oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8640c525-e6ba-4bf8-9fe0-2c08155dd1cb/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:10 user nova-compute[70374]: DEBUG nova.network.neutron [req-eba497ad-998a-4e29-9387-6a7ae29481f4 req-95f442ba-3c88-4846-91c1-81932966edad service nova] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Updated VIF entry in instance network info cache for port b825d9b4-15c4-4b47-a3f7-af9838d09458. {{(pid=70374) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Jul 27 09:32:10 user nova-compute[70374]: DEBUG nova.network.neutron [req-eba497ad-998a-4e29-9387-6a7ae29481f4 req-95f442ba-3c88-4846-91c1-81932966edad service nova] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Updating instance_info_cache with network_info: [{"id": "b825d9b4-15c4-4b47-a3f7-af9838d09458", "address": "fa:16:3e:b2:af:a6", "network": {"id": "b51b0e60-de31-47a5-908f-36f76e9fa620", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-754390872-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "0592f0be670742a181e24823955f378b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb825d9b4-15", "ovs_interfaceid": "b825d9b4-15c4-4b47-a3f7-af9838d09458", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:32:10 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-eba497ad-998a-4e29-9387-6a7ae29481f4 req-95f442ba-3c88-4846-91c1-81932966edad service nova] Releasing lock "refresh_cache-25214e8a-c626-46a7-b273-eb491c2fc91b" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Jul 27 09:32:10 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/8640c525-e6ba-4bf8-9fe0-2c08155dd1cb/disk --force-share --output=json" returned: 0 in 0.238s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:10 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/309c9c26-4a0f-45db-bb3a-595b19f3f627/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:10 
user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/309c9c26-4a0f-45db-bb3a-595b19f3f627/disk --force-share --output=json" returned: 0 in 0.150s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:10 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/309c9c26-4a0f-45db-bb3a-595b19f3f627/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:11 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/309c9c26-4a0f-45db-bb3a-595b19f3f627/disk --force-share --output=json" returned: 0 in 0.165s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:11 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:11 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06/disk --force-share --output=json" returned: 0 in 0.149s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:11 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:11 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06/disk --force-share --output=json" returned: 0 in 0.151s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:11 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Acquiring lock 
"8640c525-e6ba-4bf8-9fe0-2c08155dd1cb" by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:11 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Lock "8640c525-e6ba-4bf8-9fe0-2c08155dd1cb" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG nova.objects.instance [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Lazy-loading 'flavor' on Instance uuid 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:32:12 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Jul 27 09:32:12 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Jul 27 09:32:12 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Hypervisor/Node resource view: name=user free_ram=6811MB free_disk=25.62173080444336GB free_vcpus=0 pci_devices=[{"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=70374) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1080}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Lock 
"8640c525-e6ba-4bf8-9fe0-2c08155dd1cb" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" :: held 0.698s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Instance 04a990b9-ed32-41e9-b384-f0886e8d1b49 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 512, 'VCPU': 1}}. {{(pid=70374) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1678}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Instance fb5ccac9-1e45-4726-b681-cf34cf3fa521 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 512, 'VCPU': 1}}. {{(pid=70374) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1678}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Instance 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 512, 'VCPU': 1}}. {{(pid=70374) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1678}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Instance 6ae93f34-ce7e-4ae4-a5ba-36508361bd54 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 512, 'VCPU': 1}}. {{(pid=70374) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1678}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Instance 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 512, 'VCPU': 1}}. {{(pid=70374) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1678}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Instance 25214e8a-c626-46a7-b273-eb491c2fc91b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 512, 'VCPU': 1}}. {{(pid=70374) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1678}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Instance 42f4c546-47e4-485b-be29-4081c7557bad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 512, 'VCPU': 1}}. {{(pid=70374) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1678}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Instance 309c9c26-4a0f-45db-bb3a-595b19f3f627 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 512, 'VCPU': 1}}. 
{{(pid=70374) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1678}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Instance d35fe056-8279-479a-a673-6c61e5ec6933 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 512, 'VCPU': 1}}. {{(pid=70374) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1678}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Instance 773917c6-56d7-4491-a760-05f51593b7f0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 512, 'VCPU': 1}}. {{(pid=70374) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1678}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Instance b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 512, 'VCPU': 1}}. {{(pid=70374) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1678}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Instance ea21f129-1e0a-4ad1-b516-c2a8f92835b7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 512, 'VCPU': 1}}. {{(pid=70374) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1678}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Instance a72e9d21-b1f4-4e61-8875-c72d7ce88a32 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 512, 'VCPU': 1}}. 
{{(pid=70374) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1678}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Total usable vcpus: 12, total allocated vcpus: 13 {{(pid=70374) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1103}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Final resource view: name=user phys_ram=16011MB used_ram=7168MB phys_disk=40GB used_disk=13GB total_vcpus=12 used_vcpus=13 pci_stats=[] {{(pid=70374) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1112}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Acquiring lock "8640c525-e6ba-4bf8-9fe0-2c08155dd1cb" by "nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Lock "8640c525-e6ba-4bf8-9fe0-2c08155dd1cb" acquired by "nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:12 user nova-compute[70374]: INFO nova.compute.manager [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Attaching volume ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 to /dev/vdb Jul 27 09:32:12 user nova-compute[70374]: DEBUG os_brick.utils [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '10.0.0.210', 'multipath': False, 'enforce_multipath': True, 'host': 'user', 'execute': None}" {{(pid=70374) trace_logging_wrapper /usr/local/lib/python3.10/dist-packages/os_brick/utils.py:175}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.014s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[e2823574-8d30-4b50-a576-6274fc158aa5]: (4, ('InitiatorName=iqn.2016-04.com.open-iscsi:ce3dd372cc44\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt / -n -o SOURCE {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:12 user 
nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "findmnt / -n -o SOURCE" returned: 0 in 0.019s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[1b96a026-65c3-411d-b265-41e43cc57351]: (4, ('/dev/mapper/vg0-lv--0\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): blkid /dev/mapper/vg0-lv--0 -s UUID -o value {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "blkid /dev/mapper/vg0-lv--0 -s UUID -o value" returned: 0 in 0.017s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[f3456f0b-a6c9-49e7-89bf-d50b72d8a978]: (4, ('e95b3b51-542d-42ca-ac40-f83360608668\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[0f225bf8-46d5-4d3b-8e47-d06f5d4e931e]: (4, 'e20c3142-5af9-7467-ecd8-70b2e4a210d6') {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Running cmd (subprocess): nvme version {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] 'nvme version' failed. Not Retrying. {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:473}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.nvmeof [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] nvme not present on system {{(pid=70374) nvme_present /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/nvmeof.py:757}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): nvme show-hostnqn {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] 'nvme show-hostnqn' failed. Not Retrying. 
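The commands above (run through the privsep daemon, pid 82164) are os-brick gathering connector properties for the attach: the host's iSCSI initiator name, the device backing the root filesystem, and that device's filesystem UUID. A rough standalone approximation with plain subprocess calls; it assumes the same files and tools exist and skips the privilege separation and error handling os-brick actually uses:

import subprocess

def run(cmd):
    return subprocess.run(cmd, capture_output=True, text=True, check=True).stdout.strip()

# 'InitiatorName=iqn.2016-04.com.open-iscsi:ce3dd372cc44' -> keep the IQN part
# (a real parser would also skip comment lines in the file)
initiator = run(['cat', '/etc/iscsi/initiatorname.iscsi']).split('=', 1)[1]
root_source = run(['findmnt', '/', '-n', '-o', 'SOURCE'])             # e.g. /dev/mapper/vg0-lv--0
root_uuid = run(['blkid', root_source, '-s', 'UUID', '-o', 'value'])  # filesystem UUID
print(initiator, root_source, root_uuid)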
{{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:473}} Jul 27 09:32:12 user nova-compute[70374]: WARNING os_brick.privileged.nvmeof [-] Could not generate host nqn: [Errno 2] No such file or directory: 'nvme' Jul 27 09:32:12 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[3d049177-c907-4a4f-9223-486a5a38f3e0]: (4, '') {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.lightos [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] LIGHTOS: [Errno 111] ECONNREFUSED {{(pid=70374) find_dsc /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/lightos.py:98}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.lightos [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] LIGHTOS: did not find dsc, continuing anyway. {{(pid=70374) get_connector_properties /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/lightos.py:76}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.lightos [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] LIGHTOS: no hostnqn found. {{(pid=70374) get_connector_properties /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/lightos.py:84}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG os_brick.utils [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] <== get_connector_properties: return (143ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '10.0.0.210', 'host': 'user', 'multipath': False, 'initiator': 'iqn.2016-04.com.open-iscsi:ce3dd372cc44', 'do_local_attach': False, 'uuid': 'e95b3b51-542d-42ca-ac40-f83360608668', 'system uuid': 'e20c3142-5af9-7467-ecd8-70b2e4a210d6', 'nvme_native_multipath': False} {{(pid=70374) trace_logging_wrapper /usr/local/lib/python3.10/dist-packages/os_brick/utils.py:202}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG nova.virt.block_device [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Updating existing volume attachment record: 2be6bf71-4cb0-402d-a975-55e5458524e9 {{(pid=70374) _volume_attach /opt/stack/nova/nova/virt/block_device.py:631}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG nova.compute.provider_tree [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Inventory has not changed in ProviderTree for provider: f7548644-4a09-4ad8-9aa6-6e05d85a9f5b {{(pid=70374) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG nova.scheduler.client.report [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Inventory has not changed for provider f7548644-4a09-4ad8-9aa6-6e05d85a9f5b based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16011, 'reserved': 512, 'min_unit': 1, 'max_unit': 16011, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 
40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70374) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG nova.compute.resource_tracker [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Compute_service record updated for user:user {{(pid=70374) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1041}} Jul 27 09:32:12 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.797s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:13 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:32:13 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:32:13 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:32:13 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:32:13 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:32:13 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:32:13 user nova-compute[70374]: DEBUG oslo_service.periodic_task [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=70374) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Jul 27 09:32:13 user nova-compute[70374]: DEBUG nova.compute.manager [None req-de897f0f-b4d9-4176-abc0-9f316f88df4d None None] CONF.reclaim_instance_interval <= 0, skipping... 
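The final resource view a few entries earlier reports used_vcpus=13 against total_vcpus=12; that is consistent because placement sizes capacity from the inventory just logged as (total - reserved) * allocation_ratio, and the VCPU inventory carries an allocation_ratio of 4.0. A small worked check using the values copied from the log:

def capacity(total, reserved, allocation_ratio):
    # The per-resource-class capacity formula placement applies.
    return (total - reserved) * allocation_ratio

vcpu_capacity = capacity(total=12, reserved=0, allocation_ratio=4.0)      # 48.0
ram_capacity = capacity(total=16011, reserved=512, allocation_ratio=1.0)  # 15499.0
print(13 <= vcpu_capacity, 7168 <= ram_capacity)  # True True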
{{(pid=70374) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10461}} Jul 27 09:32:14 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:14 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Acquiring lock "25214e8a-c626-46a7-b273-eb491c2fc91b" by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:14 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Lock "25214e8a-c626-46a7-b273-eb491c2fc91b" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" :: waited 0.002s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:14 user nova-compute[70374]: DEBUG nova.objects.instance [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Lazy-loading 'flavor' on Instance uuid 25214e8a-c626-46a7-b273-eb491c2fc91b {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:32:14 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Lock "25214e8a-c626-46a7-b273-eb491c2fc91b" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" :: held 0.089s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:14 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Acquiring lock "25214e8a-c626-46a7-b273-eb491c2fc91b" by "nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:14 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Lock "25214e8a-c626-46a7-b273-eb491c2fc91b" acquired by "nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:14 user nova-compute[70374]: INFO nova.compute.manager [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Attaching volume 788f71f6-0aca-4da9-a915-59560f5cb291 to /dev/vdb Jul 27 09:32:14 user nova-compute[70374]: DEBUG os_brick.utils [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '10.0.0.210', 
'multipath': False, 'enforce_multipath': True, 'host': 'user', 'execute': None}" {{(pid=70374) trace_logging_wrapper /usr/local/lib/python3.10/dist-packages/os_brick/utils.py:175}} Jul 27 09:32:14 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:14 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.015s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:14 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[f2184cad-4a7c-41e1-bf25-6c21661634ff]: (4, ('InitiatorName=iqn.2016-04.com.open-iscsi:ce3dd372cc44\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:14 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt / -n -o SOURCE {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:14 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "findmnt / -n -o SOURCE" returned: 0 in 0.021s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:14 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[086f6da8-92c4-4619-b042-2a0c8410c865]: (4, ('/dev/mapper/vg0-lv--0\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:14 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): blkid /dev/mapper/vg0-lv--0 -s UUID -o value {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:14 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "blkid /dev/mapper/vg0-lv--0 -s UUID -o value" returned: 0 in 0.022s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:14 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[a0cf04de-3468-431c-a0e1-72bd278a30cd]: (4, ('e95b3b51-542d-42ca-ac40-f83360608668\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:14 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[73c86acf-66d2-4d4c-8478-058f438412c6]: (4, 'e20c3142-5af9-7467-ecd8-70b2e4a210d6') {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:14 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Running cmd (subprocess): nvme version {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:14 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] 'nvme version' failed. Not Retrying. 
{{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:473}} Jul 27 09:32:14 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.nvmeof [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] nvme not present on system {{(pid=70374) nvme_present /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/nvmeof.py:757}} Jul 27 09:32:14 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): nvme show-hostnqn {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:14 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] 'nvme show-hostnqn' failed. Not Retrying. {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:473}} Jul 27 09:32:14 user nova-compute[70374]: WARNING os_brick.privileged.nvmeof [-] Could not generate host nqn: [Errno 2] No such file or directory: 'nvme' Jul 27 09:32:14 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[1722dd70-fccd-4c02-8c13-d9be84b17ef9]: (4, '') {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:14 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.lightos [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] LIGHTOS: [Errno 111] ECONNREFUSED {{(pid=70374) find_dsc /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/lightos.py:98}} Jul 27 09:32:14 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.lightos [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] LIGHTOS: did not find dsc, continuing anyway. {{(pid=70374) get_connector_properties /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/lightos.py:76}} Jul 27 09:32:14 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.lightos [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] LIGHTOS: no hostnqn found. 
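The failed nvme probes above are tolerated: when the nvme CLI is missing, os-brick only warns that it could not generate a host NQN, and the LightOS connector likewise continues without a discovery client, so the NVMe-oF-specific identity is simply left out of the connector properties. A small illustration of that optional-capability pattern (not os-brick's code):

import subprocess

def try_get_hostnqn():
    try:
        out = subprocess.run(['nvme', 'show-hostnqn'],
                             capture_output=True, text=True, check=True)
        return out.stdout.strip() or None
    except (FileNotFoundError, subprocess.CalledProcessError) as exc:
        print(f'Could not generate host nqn: {exc}')  # mirrors the WARNING above
        return None

hostnqn = try_get_hostnqn()  # None on this host, so no host NQN is included in the properties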
{{(pid=70374) get_connector_properties /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/lightos.py:84}} Jul 27 09:32:14 user nova-compute[70374]: DEBUG os_brick.utils [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] <== get_connector_properties: return (135ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '10.0.0.210', 'host': 'user', 'multipath': False, 'initiator': 'iqn.2016-04.com.open-iscsi:ce3dd372cc44', 'do_local_attach': False, 'uuid': 'e95b3b51-542d-42ca-ac40-f83360608668', 'system uuid': 'e20c3142-5af9-7467-ecd8-70b2e4a210d6', 'nvme_native_multipath': False} {{(pid=70374) trace_logging_wrapper /usr/local/lib/python3.10/dist-packages/os_brick/utils.py:202}} Jul 27 09:32:14 user nova-compute[70374]: DEBUG nova.virt.block_device [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Updating existing volume attachment record: 3329de24-8de5-4111-84db-b9a71d949e64 {{(pid=70374) _volume_attach /opt/stack/nova/nova/virt/block_device.py:631}} Jul 27 09:32:17 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:17 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Acquiring lock "cache_volume_driver" by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.._cache_volume_driver" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:17 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Lock "cache_volume_driver" acquired by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.._cache_volume_driver" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:17 user nova-compute[70374]: DEBUG os_brick.initiator.connector [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Factory for ISCSI on None {{(pid=70374) factory /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connector.py:281}} Jul 27 09:32:17 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Lock "cache_volume_driver" "released" by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.._cache_volume_driver" :: held 0.006s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:17 user nova-compute[70374]: DEBUG nova.virt.libvirt.volume.iscsi [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Calling os-brick to attach iSCSI Volume {{(pid=70374) connect_volume /opt/stack/nova/nova/virt/libvirt/volume/iscsi.py:63}} Jul 27 09:32:17 user nova-compute[70374]: DEBUG 
os_brick.initiator.connectors.iscsi [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] ==> connect_volume: call "{'self': , 'connection_properties': {'target_iqn': 'iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5', 'target_portal': '172.16.0.220:3260', 'target_discovered': False, 'auth_method': 'CHAP', 'auth_username': 'huTvDrlQ', 'auth_password': '***', 'target_lun': 0, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False}}" {{(pid=70374) trace_logging_wrapper /usr/local/lib/python3.10/dist-packages/os_brick/utils.py:175}} Jul 27 09:32:17 user nova-compute[70374]: WARNING os_brick.initiator.connectors.base [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Service needs to call os_brick.setup() before connecting volumes, if it doesn't it will break on the next release Jul 27 09:32:17 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Acquiring lock "connect_volume" by "os_brick.initiator.connectors.iscsi.ISCSIConnector.connect_volume" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:68}} Jul 27 09:32:17 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Lock "connect_volume" acquired by "os_brick.initiator.connectors.iscsi.ISCSIConnector.connect_volume" :: waited 0.002s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:73}} Jul 27 09:32:17 user nova-compute[70374]: WARNING os_brick.initiator.connectors.base [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Service needs to call os_brick.setup() before connecting volumes, if it doesn't it will break on the next release Jul 27 09:32:17 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Acquiring lock "connect_to_iscsi_portal-172.16.0.220:3260-iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5" by "os_brick.initiator.connectors.iscsi.ISCSIConnector._connect_to_iscsi_portal_unsafe" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:68}} Jul 27 09:32:17 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Lock "connect_to_iscsi_portal-172.16.0.220:3260-iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5" acquired by "os_brick.initiator.connectors.iscsi.ISCSIConnector._connect_to_iscsi_portal_unsafe" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:73}} Jul 27 09:32:17 user nova-compute[70374]: INFO os_brick.initiator.connectors.iscsi [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Trying to connect to iSCSI 
portal 172.16.0.220:3260 Jul 27 09:32:17 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5 -p 172.16.0.220:3260 {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:17 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5 -p 172.16.0.220:3260" returned: 21 in 0.017s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:17 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[2888aadf-faad-46b6-9e69-5bdaee9ea622]: (4, ('', 'iscsiadm: No records found\n')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:17 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] iscsiadm (): stdout= stderr=iscsiadm: No records found Jul 27 09:32:17 user nova-compute[70374]: {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:17 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5 -p 172.16.0.220:3260 --interface default --op new {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:17 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5 -p 172.16.0.220:3260 --interface default --op new" returned: 0 in 0.018s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:17 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[63bc23bc-d8e6-4f38-8b26-9e49e77e55e0]: (4, ('New iSCSI node [tcp:[hw=,ip=,net_if=,iscsi_if=default] 172.16.0.220,3260,-1 iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5] added\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:17 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] iscsiadm ('--interface', 'default', '--op', 'new'): stdout=New iSCSI node [tcp:[hw=,ip=,net_if=,iscsi_if=default] 172.16.0.220,3260,-1 iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5] added Jul 27 09:32:17 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:17 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5 -p 172.16.0.220:3260 --op update -n node.session.scan -v manual {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:17 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5 -p 
172.16.0.220:3260 --op update -n node.session.scan -v manual" returned: 0 in 0.021s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:17 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[7731a08b-7de3-40d7-aa07-bc4ea0bf401c]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:17 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] iscsiadm ('--op', 'update', '-n', 'node.session.scan', '-v', 'manual'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:17 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5 -p 172.16.0.220:3260 --op update -n node.session.auth.authmethod -v CHAP {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:17 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5 -p 172.16.0.220:3260 --op update -n node.session.auth.authmethod -v CHAP" returned: 0 in 0.020s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:17 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[70368c6e-00c9-45bd-bd0c-c71dd04bd7ff]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:17 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] iscsiadm ('--op', 'update', '-n', 'node.session.auth.authmethod', '-v', 'CHAP'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:17 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5 -p 172.16.0.220:3260 --op update -n node.session.auth.username -v huTvDrlQ {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:17 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5 -p 172.16.0.220:3260 --op update -n node.session.auth.username -v huTvDrlQ" returned: 0 in 0.017s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:17 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[5bb24055-4d7a-4b78-84c8-bee6511442cd]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:17 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] iscsiadm ('--op', 'update', '-n', 'node.session.auth.username', '-v', 'huTvDrlQ'): stdout= stderr= 
{{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:17 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5 -p 172.16.0.220:3260 --op update -n node.session.auth.password -v *** {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:18 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5 -p 172.16.0.220:3260 --op update -n node.session.auth.password -v ***" returned: 0 in 0.017s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:18 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[3631ae05-f0fc-4e46-84cd-8a92f5c926c4]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:18 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] iscsiadm ('--op', 'update', '-n', 'node.session.auth.password', '-v', '***'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:18 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m session {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:18 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m session" returned: 21 in 0.016s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:18 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[a4f12b8f-8f7b-4077-b5c5-c991eb243c39]: (4, ('', 'iscsiadm: No active sessions.\n')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:18 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] iscsiadm ('-m', 'session'): stdout= stderr=iscsiadm: No active sessions. Jul 27 09:32:18 user nova-compute[70374]: {{(pid=70374) _run_iscsiadm_bare /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1182}} Jul 27 09:32:18 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] iscsi session list stdout= stderr=iscsiadm: No active sessions. Jul 27 09:32:18 user nova-compute[70374]: {{(pid=70374) _run_iscsi_session /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1171}} Jul 27 09:32:18 user nova-compute[70374]: WARNING os_brick.initiator.connectors.iscsi [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] iscsiadm stderr output when getting sessions: iscsiadm: No active sessions. 
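Up to this point the connector has built the iSCSI node record for the new target step by step: probe for an existing record (exit code 21, "No records found"), create it, switch session scanning to manual, configure CHAP authentication, and check for active sessions before logging in (the --login follows immediately below). The same sequence, reconstructed as plain command lists with the target and portal copied from the log and the CHAP secret replaced by a placeholder:

TARGET = 'iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5'
PORTAL = '172.16.0.220:3260'
BASE = ['iscsiadm', '-m', 'node', '-T', TARGET, '-p', PORTAL]

commands = [
    BASE,                                                                 # exit 21: no record yet
    BASE + ['--interface', 'default', '--op', 'new'],                     # create the node record
    BASE + ['--op', 'update', '-n', 'node.session.scan', '-v', 'manual'],
    BASE + ['--op', 'update', '-n', 'node.session.auth.authmethod', '-v', 'CHAP'],
    BASE + ['--op', 'update', '-n', 'node.session.auth.username', '-v', 'huTvDrlQ'],
    BASE + ['--op', 'update', '-n', 'node.session.auth.password', '-v', 'CHAP_SECRET'],
    ['iscsiadm', '-m', 'session'],                                        # exit 21: no active sessions
    BASE + ['--login'],
]
for cmd in commands:
    print(' '.join(cmd))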
Jul 27 09:32:18 user nova-compute[70374]: Jul 27 09:32:18 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5 -p 172.16.0.220:3260 --login {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:18 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5 -p 172.16.0.220:3260 --login" returned: 0 in 0.054s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:18 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[4456e8e2-121b-49c9-a0b3-60f7f6438a08]: (4, ('Logging in to [iface: default, target: iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5, portal: 172.16.0.220,3260]\nLogin to [iface: default, target: iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5, portal: 172.16.0.220,3260] successful.\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:18 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] iscsiadm ('--login',): stdout=Logging in to [iface: default, target: iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5, portal: 172.16.0.220,3260] Jul 27 09:32:18 user nova-compute[70374]: Login to [iface: default, target: iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5, portal: 172.16.0.220,3260] successful. Jul 27 09:32:18 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:18 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5 -p 172.16.0.220:3260 --op update -n node.startup -v automatic {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:18 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5 -p 172.16.0.220:3260 --op update -n node.startup -v automatic" returned: 0 in 0.024s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:18 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[1c4674df-b4f0-4abb-b6b6-520cda4040ea]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:18 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] iscsiadm ('--op', 'update', '-n', 'node.startup', '-v', 'automatic'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:18 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m session {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:18 user 
nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m session" returned: 0 in 0.024s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:18 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[1b8ca2c5-29e1-49cc-9a25-5a4d6034a7a8]: (4, ('tcp: [2] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5 (non-flash)\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:18 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] iscsiadm ('-m', 'session'): stdout=tcp: [2] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5 (non-flash) Jul 27 09:32:18 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm_bare /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1182}} Jul 27 09:32:18 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] iscsi session list stdout=tcp: [2] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5 (non-flash) Jul 27 09:32:18 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsi_session /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1171}} Jul 27 09:32:18 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Lock "connect_to_iscsi_portal-172.16.0.220:3260-iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5" "released" by "os_brick.initiator.connectors.iscsi.ISCSIConnector._connect_to_iscsi_portal_unsafe" :: held 0.271s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:87}} Jul 27 09:32:18 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Connected to 172.16.0.220:3260 {{(pid=70374) _connect_vol /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:633}} Jul 27 09:32:18 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] HCTL ('33', '-', '-', 0) found on session 2 with lun 0 {{(pid=70374) get_hctl /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:694}} Jul 27 09:32:18 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Scanning host 33 c: -, t: -, l: 0) {{(pid=70374) scan_iscsi /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:719}} Jul 27 09:32:18 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): tee -a /sys/class/scsi_host/host33/scan {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:18 user 
nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "tee -a /sys/class/scsi_host/host33/scan" returned: 0 in 0.024s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:18 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[271c9ef4-a29f-46bd-b20e-f229f3acc48f]: (4, ('- - 0', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:18 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Searching for a device in session 2 and hctl ['33', '*', '*', 0] yield: sdb {{(pid=70374) device_name_by_hctl /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:713}} Jul 27 09:32:18 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Connected to sdb using {'target_iqn': 'iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5', 'target_portal': '172.16.0.220:3260', 'target_discovered': False, 'auth_method': 'CHAP', 'auth_username': 'huTvDrlQ', 'auth_password': '***', 'target_lun': 0, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False} {{(pid=70374) _connect_vol /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:661}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Lock "connect_volume" "released" by "os_brick.initiator.connectors.iscsi.ISCSIConnector.connect_volume" :: held 1.313s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:87}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] <== connect_volume: return (1315ms) {'type': 'block', 'scsi_wwn': '23633653466643234', 'path': '/dev/sdb'} {{(pid=70374) trace_logging_wrapper /usr/local/lib/python3.10/dist-packages/os_brick/utils.py:202}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG nova.virt.libvirt.volume.iscsi [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Attached iSCSI volume {'type': 'block', 'scsi_wwn': '23633653466643234', 'path': '/dev/sdb'} {{(pid=70374) connect_volume /opt/stack/nova/nova/virt/libvirt/volume/iscsi.py:65}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG nova.objects.instance [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Lazy-loading 'flavor' on Instance uuid 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG nova.virt.libvirt.guest [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 
tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] attach device xml: Jul 27 09:32:19 user nova-compute[70374]: Jul 27 09:32:19 user nova-compute[70374]: Jul 27 09:32:19 user nova-compute[70374]: Jul 27 09:32:19 user nova-compute[70374]: 4aac502f-2b4f-4e7f-83c0-30e91f701ca5 Jul 27 09:32:19 user nova-compute[70374]: Jul 27 09:32:19 user nova-compute[70374]: {{(pid=70374) attach_device /opt/stack/nova/nova/virt/libvirt/guest.py:339}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] No BDM found with device name vda, not building metadata. {{(pid=70374) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] No BDM found with device name vdb, not building metadata. {{(pid=70374) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] No VIF found with MAC fa:16:3e:de:fb:c0, not building metadata {{(pid=70374) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG nova.virt.libvirt.volume.iscsi [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Calling os-brick to attach iSCSI Volume {{(pid=70374) connect_volume /opt/stack/nova/nova/virt/libvirt/volume/iscsi.py:63}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] ==> connect_volume: call "{'self': , 'connection_properties': {'target_iqn': 'iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251', 'target_portal': '172.16.0.220:3260', 'target_discovered': False, 'auth_method': 'CHAP', 'auth_username': 'RWyzhePO', 'auth_password': '***', 'target_lun': 0, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False}}" {{(pid=70374) trace_logging_wrapper /usr/local/lib/python3.10/dist-packages/os_brick/utils.py:175}} Jul 27 09:32:19 user nova-compute[70374]: WARNING os_brick.initiator.connectors.base [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Service needs to call os_brick.setup() before connecting volumes, if it doesn't it will break on the next release Jul 27 09:32:19 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Acquiring lock "connect_volume" by "os_brick.initiator.connectors.iscsi.ISCSIConnector.connect_volume" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:68}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None 
req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Lock "connect_volume" acquired by "os_brick.initiator.connectors.iscsi.ISCSIConnector.connect_volume" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:73}} Jul 27 09:32:19 user nova-compute[70374]: WARNING os_brick.initiator.connectors.base [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Service needs to call os_brick.setup() before connecting volumes, if it doesn't it will break on the next release Jul 27 09:32:19 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Acquiring lock "connect_to_iscsi_portal-172.16.0.220:3260-iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251" by "os_brick.initiator.connectors.iscsi.ISCSIConnector._connect_to_iscsi_portal_unsafe" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:68}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Lock "connect_to_iscsi_portal-172.16.0.220:3260-iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251" acquired by "os_brick.initiator.connectors.iscsi.ISCSIConnector._connect_to_iscsi_portal_unsafe" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:73}} Jul 27 09:32:19 user nova-compute[70374]: INFO os_brick.initiator.connectors.iscsi [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Trying to connect to iSCSI portal 172.16.0.220:3260 Jul 27 09:32:19 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 -p 172.16.0.220:3260 {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 -p 172.16.0.220:3260" returned: 21 in 0.023s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[4ad79c61-19f4-42cf-83f4-f119ae6e7cbf]: (4, ('', 'iscsiadm: No records found\n')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] iscsiadm (): stdout= stderr=iscsiadm: No records found Jul 27 09:32:19 user nova-compute[70374]: {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm 
-m node -T iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 -p 172.16.0.220:3260 --interface default --op new {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 -p 172.16.0.220:3260 --interface default --op new" returned: 0 in 0.025s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[141fc557-3a98-4426-a3f0-f6ea1a0fa61e]: (4, ('New iSCSI node [tcp:[hw=,ip=,net_if=,iscsi_if=default] 172.16.0.220,3260,-1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251] added\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] iscsiadm ('--interface', 'default', '--op', 'new'): stdout=New iSCSI node [tcp:[hw=,ip=,net_if=,iscsi_if=default] 172.16.0.220,3260,-1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251] added Jul 27 09:32:19 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 -p 172.16.0.220:3260 --op update -n node.session.scan -v manual {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 -p 172.16.0.220:3260 --op update -n node.session.scan -v manual" returned: 0 in 0.023s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[4dd9bbcc-ccbb-4e48-a2dd-37ee1e8c5640]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] iscsiadm ('--op', 'update', '-n', 'node.session.scan', '-v', 'manual'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 -p 172.16.0.220:3260 --op update -n node.session.auth.authmethod -v CHAP {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 -p 172.16.0.220:3260 --op update -n node.session.auth.authmethod -v CHAP" returned: 
0 in 0.025s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[22063849-fd5a-489e-9960-8ad43503e9d2]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] iscsiadm ('--op', 'update', '-n', 'node.session.auth.authmethod', '-v', 'CHAP'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 -p 172.16.0.220:3260 --op update -n node.session.auth.username -v RWyzhePO {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-5f197139-a16a-49e2-9a42-dd4d2757d842 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Lock "6a5593cd-3ab6-4859-b0fb-33a0ed702dc8" "released" by "nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" :: held 13.755s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 -p 172.16.0.220:3260 --op update -n node.session.auth.username -v RWyzhePO" returned: 0 in 0.021s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[f3502cd4-9105-4252-8f02-358a01d72a80]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] iscsiadm ('--op', 'update', '-n', 'node.session.auth.username', '-v', 'RWyzhePO'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 -p 172.16.0.220:3260 --op update -n node.session.auth.password -v *** {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 -p 172.16.0.220:3260 --op update -n node.session.auth.password -v ***" returned: 0 in 0.019s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[af2e645a-e599-4c1c-8438-5323bdc4ee77]: (4, ('', '')) {{(pid=82164) 
_call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] iscsiadm ('--op', 'update', '-n', 'node.session.auth.password', '-v', '***'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m session {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m session" returned: 0 in 0.022s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[f991d653-052c-4a64-ba60-0a594ac0b723]: (4, ('tcp: [2] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5 (non-flash)\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] iscsiadm ('-m', 'session'): stdout=tcp: [2] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5 (non-flash) Jul 27 09:32:19 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm_bare /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1182}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] iscsi session list stdout=tcp: [2] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5 (non-flash) Jul 27 09:32:19 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsi_session /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1171}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 -p 172.16.0.220:3260 --login {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 -p 172.16.0.220:3260 --login" returned: 0 in 0.058s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:19 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[0baff6c5-3de8-4138-85f4-2f9bdeff82ac]: (4, ('Logging in to [iface: default, target: iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251, portal: 172.16.0.220,3260]\nLogin to [iface: default, target: iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251, portal: 172.16.0.220,3260] successful.\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 
09:32:19 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] iscsiadm ('--login',): stdout=Logging in to [iface: default, target: iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251, portal: 172.16.0.220,3260] Jul 27 09:32:19 user nova-compute[70374]: Login to [iface: default, target: iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251, portal: 172.16.0.220,3260] successful. Jul 27 09:32:19 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:20 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 -p 172.16.0.220:3260 --op update -n node.startup -v automatic {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:20 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 -p 172.16.0.220:3260 --op update -n node.startup -v automatic" returned: 0 in 0.021s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:20 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[13ca1a6d-4686-4a8f-9ac9-bd52b40ad305]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:20 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] iscsiadm ('--op', 'update', '-n', 'node.startup', '-v', 'automatic'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:20 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m session {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:20 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m session" returned: 0 in 0.021s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:20 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[fb082599-7ef2-4e3f-8ad7-e6b455358f7b]: (4, ('tcp: [2] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5 (non-flash)\ntcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash)\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:20 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] iscsiadm ('-m', 'session'): stdout=tcp: [2] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5 (non-flash) Jul 27 09:32:20 user nova-compute[70374]: tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash) Jul 27 09:32:20 user 
nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm_bare /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1182}} Jul 27 09:32:20 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] iscsi session list stdout=tcp: [2] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5 (non-flash) Jul 27 09:32:20 user nova-compute[70374]: tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash) Jul 27 09:32:20 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsi_session /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1171}} Jul 27 09:32:20 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Lock "connect_to_iscsi_portal-172.16.0.220:3260-iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251" "released" by "os_brick.initiator.connectors.iscsi.ISCSIConnector._connect_to_iscsi_portal_unsafe" :: held 0.301s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:87}} Jul 27 09:32:20 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Connected to 172.16.0.220:3260 {{(pid=70374) _connect_vol /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:633}} Jul 27 09:32:20 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] HCTL ('34', '-', '-', 0) found on session 3 with lun 0 {{(pid=70374) get_hctl /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:694}} Jul 27 09:32:20 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Scanning host 34 c: -, t: -, l: 0) {{(pid=70374) scan_iscsi /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:719}} Jul 27 09:32:20 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): tee -a /sys/class/scsi_host/host34/scan {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:20 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "tee -a /sys/class/scsi_host/host34/scan" returned: 0 in 0.021s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:20 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[5bc1a199-d60b-4a56-9c83-344da921443f]: (4, ('- - 0', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:20 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Searching for a device in session 3 and hctl ['34', '*', '*', 0] yield: sdc 
{{(pid=70374) device_name_by_hctl /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:713}} Jul 27 09:32:20 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Connected to sdc using {'target_iqn': 'iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251', 'target_portal': '172.16.0.220:3260', 'target_discovered': False, 'auth_method': 'CHAP', 'auth_username': 'RWyzhePO', 'auth_password': '***', 'target_lun': 0, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False} {{(pid=70374) _connect_vol /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:661}} Jul 27 09:32:20 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Acquiring lock "6a5593cd-3ab6-4859-b0fb-33a0ed702dc8" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:20 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Lock "6a5593cd-3ab6-4859-b0fb-33a0ed702dc8" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:20 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Acquiring lock "6a5593cd-3ab6-4859-b0fb-33a0ed702dc8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:20 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Lock "6a5593cd-3ab6-4859-b0fb-33a0ed702dc8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:20 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Lock "6a5593cd-3ab6-4859-b0fb-33a0ed702dc8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:20 user nova-compute[70374]: INFO nova.compute.manager [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Terminating instance Jul 27 09:32:20 user nova-compute[70374]: DEBUG nova.compute.manager [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 
tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Start destroying the instance on the hypervisor. {{(pid=70374) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} Jul 27 09:32:20 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:20 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:20 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:20 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:20 user nova-compute[70374]: DEBUG nova.compute.manager [req-1c9498dc-4b52-4893-8c89-b48845d1fb97 req-1e3d3c85-5538-46f0-8a49-897df55ebf3e service nova] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Received event network-vif-unplugged-98a9819c-4270-4b75-b851-635a3b19a7b4 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:32:20 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-1c9498dc-4b52-4893-8c89-b48845d1fb97 req-1e3d3c85-5538-46f0-8a49-897df55ebf3e service nova] Acquiring lock "6a5593cd-3ab6-4859-b0fb-33a0ed702dc8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:20 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-1c9498dc-4b52-4893-8c89-b48845d1fb97 req-1e3d3c85-5538-46f0-8a49-897df55ebf3e service nova] Lock "6a5593cd-3ab6-4859-b0fb-33a0ed702dc8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:20 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-1c9498dc-4b52-4893-8c89-b48845d1fb97 req-1e3d3c85-5538-46f0-8a49-897df55ebf3e service nova] Lock "6a5593cd-3ab6-4859-b0fb-33a0ed702dc8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:20 user nova-compute[70374]: DEBUG nova.compute.manager [req-1c9498dc-4b52-4893-8c89-b48845d1fb97 req-1e3d3c85-5538-46f0-8a49-897df55ebf3e service nova] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] No waiting events found dispatching network-vif-unplugged-98a9819c-4270-4b75-b851-635a3b19a7b4 {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:32:20 user nova-compute[70374]: DEBUG nova.compute.manager [req-1c9498dc-4b52-4893-8c89-b48845d1fb97 req-1e3d3c85-5538-46f0-8a49-897df55ebf3e service nova] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Received event network-vif-unplugged-98a9819c-4270-4b75-b851-635a3b19a7b4 for instance with task_state deleting. 
{{(pid=70374) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10810}} Jul 27 09:32:21 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Lock "connect_volume" "released" by "os_brick.initiator.connectors.iscsi.ISCSIConnector.connect_volume" :: held 1.339s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:87}} Jul 27 09:32:21 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] <== connect_volume: return (1341ms) {'type': 'block', 'scsi_wwn': '26562613965376139', 'path': '/dev/sdc'} {{(pid=70374) trace_logging_wrapper /usr/local/lib/python3.10/dist-packages/os_brick/utils.py:202}} Jul 27 09:32:21 user nova-compute[70374]: DEBUG nova.virt.libvirt.volume.iscsi [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb] Attached iSCSI volume {'type': 'block', 'scsi_wwn': '26562613965376139', 'path': '/dev/sdc'} {{(pid=70374) connect_volume /opt/stack/nova/nova/virt/libvirt/volume/iscsi.py:65}} Jul 27 09:32:21 user nova-compute[70374]: DEBUG nova.objects.instance [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Lazy-loading 'flavor' on Instance uuid 8640c525-e6ba-4bf8-9fe0-2c08155dd1cb {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:32:21 user nova-compute[70374]: DEBUG nova.virt.libvirt.guest [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] attach device xml: Jul 27 09:32:21 user nova-compute[70374]: Jul 27 09:32:21 user nova-compute[70374]: Jul 27 09:32:21 user nova-compute[70374]: Jul 27 09:32:21 user nova-compute[70374]: ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 Jul 27 09:32:21 user nova-compute[70374]: Jul 27 09:32:21 user nova-compute[70374]: {{(pid=70374) attach_device /opt/stack/nova/nova/virt/libvirt/guest.py:339}} Jul 27 09:32:21 user nova-compute[70374]: INFO nova.virt.libvirt.driver [-] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Instance destroyed successfully. 
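The records above (09:32:19-09:32:20) show os-brick attaching the volume by driving iscsiadm directly: register the node record for the target and portal, switch the session to manual LUN scanning, set the CHAP credentials, log in, mark the node for automatic startup, then trigger a SCSI scan through sysfs and resolve the new device (sdc) by session and HCTL. What follows is a minimal editorial sketch, not the os-brick implementation, that replays the same iscsiadm/sysfs sequence with Python's subprocess module. The target IQN, portal, CHAP username, and host number are the values printed in the log; the password placeholder and the small iscsiadm() helper are illustrative only.

# Sketch only: replays the iscsiadm/sysfs sequence recorded above.
# Values are taken from the log; CHAP_PASSWORD is a placeholder (the log
# masks the real secret as ***). Root privileges are needed, as the real
# calls run through the oslo.privsep daemon (pid=82164 above).
import subprocess

TARGET_IQN = "iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251"
PORTAL = "172.16.0.220:3260"
CHAP_USER = "RWyzhePO"
CHAP_PASSWORD = "***"  # placeholder; never hard-code or log real credentials

def iscsiadm(*args):
    # Helper mirroring the "Running cmd (subprocess): iscsiadm -m node ..." lines.
    cmd = ["iscsiadm", "-m", "node", "-T", TARGET_IQN, "-p", PORTAL, *args]
    return subprocess.run(cmd, check=True, capture_output=True, text=True)

# 1. Register the node record for this target/portal.
iscsiadm("--interface", "default", "--op", "new")
# 2. Disable automatic LUN scanning so the scan can be issued manually later.
iscsiadm("--op", "update", "-n", "node.session.scan", "-v", "manual")
# 3. Configure CHAP authentication.
iscsiadm("--op", "update", "-n", "node.session.auth.authmethod", "-v", "CHAP")
iscsiadm("--op", "update", "-n", "node.session.auth.username", "-v", CHAP_USER)
iscsiadm("--op", "update", "-n", "node.session.auth.password", "-v", CHAP_PASSWORD)
# 4. Log in to the target, then keep the node across reboots.
iscsiadm("--login")
iscsiadm("--op", "update", "-n", "node.startup", "-v", "automatic")

# 5. Manual SCSI scan, as in "tee -a /sys/class/scsi_host/host34/scan".
#    The host number (34 in the log) comes from the HCTL of the new session.
host = 34  # example value from the log
with open(f"/sys/class/scsi_host/host{host}/scan", "a") as f:
    f.write("- - 0\n")  # "channel target lun"; "- -" wildcards, LUN 0 as logged

After the scan, os-brick resolves the device name from the session and HCTL (sdc here), returns the /dev path and SCSI WWN to Nova, and Nova builds the libvirt disk XML logged in the "attach device xml" record before attaching it to the guest.
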
Jul 27 09:32:21 user nova-compute[70374]: DEBUG nova.objects.instance [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Lazy-loading 'resources' on Instance uuid 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:32:21 user nova-compute[70374]: DEBUG nova.virt.libvirt.vif [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-07-27T09:30:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1737227100',display_name='tempest-DeleteServersTestJSON-server-1737227100',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(6),hidden=False,host='user',hostname='tempest-deleteserverstestjson-server-1737227100',id=3,image_ref='35458adf-261a-4e0b-a4db-b243619b2394',info_cache=InstanceInfoCache,instance_type_id=6,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-07-27T09:30:26Z,launched_on='user',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='0d99bef1aeee4c6090e60bdbb0ecbeda',ramdisk_id='',reservation_id='r-vup3cqqc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='35458adf-261a-4e0b-a4db-b243619b2394',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-DeleteServersTestJSON-322957873',owner_user_name='tempest-DeleteServersTestJSON-322957873-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-07-27T09:30:27Z,user_data=None,user_id='c87590f95d1147f48a94203d8be751ab',uuid=6a5593cd-3ab6-4859-b0fb-33a0ed702dc8,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "98a9819c-4270-4b75-b851-635a3b19a7b4", "address": "fa:16:3e:de:fb:c0", "network": {"id": "6faafed3-97ab-43b5-bf25-0d0a468935f8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-754617940-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "0d99bef1aeee4c6090e60bdbb0ecbeda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, 
"devname": "tap98a9819c-42", "ovs_interfaceid": "98a9819c-4270-4b75-b851-635a3b19a7b4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Jul 27 09:32:21 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Converting VIF {"id": "98a9819c-4270-4b75-b851-635a3b19a7b4", "address": "fa:16:3e:de:fb:c0", "network": {"id": "6faafed3-97ab-43b5-bf25-0d0a468935f8", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-754617940-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "0d99bef1aeee4c6090e60bdbb0ecbeda", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap98a9819c-42", "ovs_interfaceid": "98a9819c-4270-4b75-b851-635a3b19a7b4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Jul 27 09:32:21 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:de:fb:c0,bridge_name='br-int',has_traffic_filtering=True,id=98a9819c-4270-4b75-b851-635a3b19a7b4,network=Network(6faafed3-97ab-43b5-bf25-0d0a468935f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap98a9819c-42') {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Jul 27 09:32:21 user nova-compute[70374]: DEBUG os_vif [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:fb:c0,bridge_name='br-int',has_traffic_filtering=True,id=98a9819c-4270-4b75-b851-635a3b19a7b4,network=Network(6faafed3-97ab-43b5-bf25-0d0a468935f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap98a9819c-42') {{(pid=70374) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Jul 27 09:32:21 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:21 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap98a9819c-42, bridge=br-int, if_exists=True) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:32:21 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:21 user 
nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Jul 27 09:32:21 user nova-compute[70374]: INFO os_vif [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:fb:c0,bridge_name='br-int',has_traffic_filtering=True,id=98a9819c-4270-4b75-b851-635a3b19a7b4,network=Network(6faafed3-97ab-43b5-bf25-0d0a468935f8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap98a9819c-42') Jul 27 09:32:21 user nova-compute[70374]: DEBUG nova.virt.libvirt.volume.iscsi [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] calling os-brick to detach iSCSI Volume {{(pid=70374) disconnect_volume /opt/stack/nova/nova/virt/libvirt/volume/iscsi.py:72}} Jul 27 09:32:21 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] ==> disconnect_volume: call "{'args': (, {'target_iqn': 'iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5', 'target_portal': '172.16.0.220:3260', 'target_discovered': False, 'auth_method': 'CHAP', 'auth_username': 'huTvDrlQ', 'auth_password': '***', 'target_lun': 0, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False, 'device_path': '/dev/sdb'}, None), 'kwargs': True}}" {{(pid=70374) trace_logging_wrapper /usr/local/lib/python3.10/dist-packages/os_brick/utils.py:175}} Jul 27 09:32:21 user nova-compute[70374]: WARNING os_brick.initiator.connectors.base [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Service needs to call os_brick.setup() before connecting volumes, if it doesn't it will break on the next release Jul 27 09:32:21 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Acquiring lock "connect_volume" by "os_brick.initiator.connectors.iscsi.ISCSIConnector.disconnect_volume" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:68}} Jul 27 09:32:21 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Lock "connect_volume" acquired by "os_brick.initiator.connectors.iscsi.ISCSIConnector.disconnect_volume" :: waited 0.002s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:73}} Jul 27 09:32:21 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m discoverydb -o show -P 1 {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m discoverydb -o show -P 1" returned: 0 in 0.018s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 
09:32:22 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[011b182b-0013-40a4-9f02-15b73f81c6db]: (4, ('SENDTARGETS:\nNo targets found.\niSNS:\nNo targets found.\nSTATIC:\nTarget: iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5\n\tPortal: 172.16.0.220:3260,-1\n\t\tIface Name: default\nTarget: iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251\n\tPortal: 172.16.0.220:3260,-1\n\t\tIface Name: default\nFIRMWARE:\nNo targets found.\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] iscsiadm ['-m', 'discoverydb', '-o', 'show', '-P', 1]: stdout=SENDTARGETS: Jul 27 09:32:22 user nova-compute[70374]: No targets found. Jul 27 09:32:22 user nova-compute[70374]: iSNS: Jul 27 09:32:22 user nova-compute[70374]: No targets found. Jul 27 09:32:22 user nova-compute[70374]: STATIC: Jul 27 09:32:22 user nova-compute[70374]: Target: iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5 Jul 27 09:32:22 user nova-compute[70374]: Portal: 172.16.0.220:3260,-1 Jul 27 09:32:22 user nova-compute[70374]: Iface Name: default Jul 27 09:32:22 user nova-compute[70374]: Target: iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 Jul 27 09:32:22 user nova-compute[70374]: Portal: 172.16.0.220:3260,-1 Jul 27 09:32:22 user nova-compute[70374]: Iface Name: default Jul 27 09:32:22 user nova-compute[70374]: FIRMWARE: Jul 27 09:32:22 user nova-compute[70374]: No targets found. Jul 27 09:32:22 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm_bare /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1182}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Regex to get portals from discoverydb: ^SENDTARGETS: Jul 27 09:32:22 user nova-compute[70374]: .*?^DiscoveryAddress: 172.16.0.220,3260.*? Jul 27 09:32:22 user nova-compute[70374]: (.*?)^(?:DiscoveryAddress|iSNS):.* {{(pid=70374) _get_discoverydb_portals /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:371}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Getting connected devices for (ips,iqns,luns)=[('172.16.0.220:3260', 'iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5', 0)] {{(pid=70374) _get_connection_devices /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:819}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=70374) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] No BDM found with device name vdb, not building metadata. {{(pid=70374) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] No VIF found with MAC fa:16:3e:21:44:72, not building metadata {{(pid=70374) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node" returned: 0 in 0.024s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[94cbdef6-c48c-40ff-a9b2-706f81d40164]: (4, ('172.16.0.220:3260,4294967295 iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5\n172.16.0.220:3260,4294967295 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m session {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m session" returned: 0 in 0.027s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[92481d8a-75eb-47ca-b605-dc2811749c69]: (4, ('tcp: [2] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5 (non-flash)\ntcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash)\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] iscsiadm ('-m', 'session'): stdout=tcp: [2] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5 (non-flash) Jul 27 09:32:22 user nova-compute[70374]: tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash) Jul 27 09:32:22 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm_bare /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1182}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] iscsi session list stdout=tcp: [2] 
172.16.0.220:3260,1 iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5 (non-flash) Jul 27 09:32:22 user nova-compute[70374]: tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash) Jul 27 09:32:22 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsi_session /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1171}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Resulting device map defaultdict(. at 0x7f27fc3413f0>, {('172.16.0.220:3260', 'iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5'): ({'sdb'}, set())}) {{(pid=70374) _get_connection_devices /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:852}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Removing single pathed devices sdb {{(pid=70374) remove_connection /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:309}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.022s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[b57cb708-a4f9-435b-87ac-044267ba9cf5]: (4, ('path checker states:\nup 2\n\npaths: 0\nbusy: False\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd del path /dev/sdb {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "multipathd del path /dev/sdb" returned: 0 in 0.025s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[69763200-f0f7-4d97-82b3-0bc1dee74ca6]: (4, ('ok\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Flushing IO for device /dev/sdb {{(pid=70374) flush_device_io /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:369}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): blockdev --flushbufs /dev/sdb {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-a4ab3a6d-8345-46b6-9576-d3f98c09a05a tempest-AttachVolumeNegativeTest-305502590 
tempest-AttachVolumeNegativeTest-305502590-project-member] Lock "8640c525-e6ba-4bf8-9fe0-2c08155dd1cb" "released" by "nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" :: held 9.787s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "blockdev --flushbufs /dev/sdb" returned: 0 in 0.015s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[4705dbfd-37dd-4d28-8bb5-5c186007a7e0]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Remove SCSI device /dev/sdb with /sys/block/sdb/device/delete {{(pid=70374) remove_scsi_device /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:83}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): tee -a /sys/block/sdb/device/delete {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "tee -a /sys/block/sdb/device/delete" returned: 0 in 0.058s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[7a8fcf62-5f95-426e-ab53-e13c556efbe6]: (4, ('1', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Checking to see if SCSI volumes sdb have been removed. {{(pid=70374) wait_for_volumes_removal /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:91}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] SCSI volumes sdb have been removed. 
{{(pid=70374) wait_for_volumes_removal /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:101}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Disconnecting from: [('172.16.0.220:3260', 'iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5')] {{(pid=70374) _disconnect_connection /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1160}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5 -p 172.16.0.220:3260 --op update -n node.startup -v manual {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5 -p 172.16.0.220:3260 --op update -n node.startup -v manual" returned: 0 in 0.025s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[e71e28b6-7c7b-4437-bc5a-b5408a72fdec]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] iscsiadm ('--op', 'update', '-n', 'node.startup', '-v', 'manual'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5 -p 172.16.0.220:3260 --logout {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5 -p 172.16.0.220:3260 --logout" returned: 0 in 0.068s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[861d569d-732c-4a3a-86ee-c57d4fbbae2a]: (4, ('Logging out of session [sid: 2, target: iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5, portal: 172.16.0.220,3260]\nLogout of [sid: 2, target: iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5, portal: 172.16.0.220,3260] successful.\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] iscsiadm ('--logout',): stdout=Logging out of session [sid: 2, target: iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5, portal: 172.16.0.220,3260] Jul 27 09:32:22 user nova-compute[70374]: Logout of [sid: 2, target: 
iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5, portal: 172.16.0.220,3260] successful. Jul 27 09:32:22 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5 -p 172.16.0.220:3260 --op delete {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:4aac502f-2b4f-4e7f-83c0-30e91f701ca5 -p 172.16.0.220:3260 --op delete" returned: 0 in 0.019s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[a1348e65-9c52-4688-a268-787002d30ba1]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] iscsiadm ('--op', 'delete'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Lock "connect_volume" "released" by "os_brick.initiator.connectors.iscsi.ISCSIConnector.disconnect_volume" :: held 0.375s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:87}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] <== disconnect_volume: return (377ms) None {{(pid=70374) trace_logging_wrapper /usr/local/lib/python3.10/dist-packages/os_brick/utils.py:202}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG nova.virt.libvirt.volume.iscsi [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Disconnected iSCSI Volume {{(pid=70374) disconnect_volume /opt/stack/nova/nova/virt/libvirt/volume/iscsi.py:79}} Jul 27 09:32:22 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Deleting instance files /opt/stack/data/nova/instances/6a5593cd-3ab6-4859-b0fb-33a0ed702dc8_del Jul 27 09:32:22 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Deletion of /opt/stack/data/nova/instances/6a5593cd-3ab6-4859-b0fb-33a0ed702dc8_del complete Jul 27 09:32:22 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None 
req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Checking UEFI support for host arch (x86_64) {{(pid=70374) supports_uefi /opt/stack/nova/nova/virt/libvirt/host.py:1751}} Jul 27 09:32:22 user nova-compute[70374]: INFO nova.virt.libvirt.host [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] UEFI support detected Jul 27 09:32:22 user nova-compute[70374]: INFO nova.compute.manager [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Took 2.02 seconds to destroy the instance on the hypervisor. Jul 27 09:32:22 user nova-compute[70374]: DEBUG oslo.service.loopingcall [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=70374) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG nova.compute.manager [-] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Deallocating network for instance {{(pid=70374) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG nova.network.neutron [-] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] deallocate_for_instance() {{(pid=70374) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG nova.compute.manager [req-1d7b5a4e-7729-41bf-80c4-ff910a186e58 req-f41c051b-b4f6-4b8a-bfcf-11632132ecec service nova] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Received event network-vif-plugged-98a9819c-4270-4b75-b851-635a3b19a7b4 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-1d7b5a4e-7729-41bf-80c4-ff910a186e58 req-f41c051b-b4f6-4b8a-bfcf-11632132ecec service nova] Acquiring lock "6a5593cd-3ab6-4859-b0fb-33a0ed702dc8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-1d7b5a4e-7729-41bf-80c4-ff910a186e58 req-f41c051b-b4f6-4b8a-bfcf-11632132ecec service nova] Lock "6a5593cd-3ab6-4859-b0fb-33a0ed702dc8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-1d7b5a4e-7729-41bf-80c4-ff910a186e58 req-f41c051b-b4f6-4b8a-bfcf-11632132ecec service nova] Lock "6a5593cd-3ab6-4859-b0fb-33a0ed702dc8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:22 user nova-compute[70374]: DEBUG nova.compute.manager [req-1d7b5a4e-7729-41bf-80c4-ff910a186e58 req-f41c051b-b4f6-4b8a-bfcf-11632132ecec service nova] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] No waiting 
events found dispatching network-vif-plugged-98a9819c-4270-4b75-b851-635a3b19a7b4 {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:32:22 user nova-compute[70374]: WARNING nova.compute.manager [req-1d7b5a4e-7729-41bf-80c4-ff910a186e58 req-f41c051b-b4f6-4b8a-bfcf-11632132ecec service nova] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Received unexpected event network-vif-plugged-98a9819c-4270-4b75-b851-635a3b19a7b4 for instance with vm_state active and task_state deleting. Jul 27 09:32:23 user nova-compute[70374]: DEBUG nova.network.neutron [-] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Updating instance_info_cache with network_info: [] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:32:23 user nova-compute[70374]: INFO nova.compute.manager [-] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Took 0.61 seconds to deallocate network for instance. Jul 27 09:32:23 user nova-compute[70374]: DEBUG nova.compute.manager [req-ba32ca44-33bc-4054-a015-45c244dd7a31 req-168b3e6a-7c0e-4e63-9723-85235f854762 service nova] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Received event network-vif-deleted-98a9819c-4270-4b75-b851-635a3b19a7b4 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:32:23 user nova-compute[70374]: INFO nova.compute.manager [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Took 0.76 seconds to detach 1 volumes for instance. Jul 27 09:32:23 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:23 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:24 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:24 user nova-compute[70374]: DEBUG nova.compute.provider_tree [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Inventory has not changed in ProviderTree for provider: f7548644-4a09-4ad8-9aa6-6e05d85a9f5b {{(pid=70374) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Jul 27 09:32:24 user nova-compute[70374]: DEBUG nova.scheduler.client.report [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Inventory has not changed for provider f7548644-4a09-4ad8-9aa6-6e05d85a9f5b based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16011, 'reserved': 512, 
'min_unit': 1, 'max_unit': 16011, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70374) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Jul 27 09:32:24 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.410s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:24 user nova-compute[70374]: INFO nova.scheduler.client.report [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Deleted allocations for instance 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8 Jul 27 09:32:24 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-1ff87b29-7a6d-4d36-bbb3-39f654063099 tempest-DeleteServersTestJSON-322957873 tempest-DeleteServersTestJSON-322957873-project-member] Lock "6a5593cd-3ab6-4859-b0fb-33a0ed702dc8" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 4.096s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:24 user nova-compute[70374]: DEBUG nova.virt.libvirt.volume.iscsi [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Calling os-brick to attach iSCSI Volume {{(pid=70374) connect_volume /opt/stack/nova/nova/virt/libvirt/volume/iscsi.py:63}} Jul 27 09:32:24 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] ==> connect_volume: call "{'self': , 'connection_properties': {'target_iqn': 'iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291', 'target_portal': '172.16.0.220:3260', 'target_discovered': False, 'auth_method': 'CHAP', 'auth_username': 'RmCwbdGB', 'auth_password': '***', 'target_lun': 0, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False}}" {{(pid=70374) trace_logging_wrapper /usr/local/lib/python3.10/dist-packages/os_brick/utils.py:175}} Jul 27 09:32:24 user nova-compute[70374]: WARNING os_brick.initiator.connectors.base [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Service needs to call os_brick.setup() before connecting volumes, if it doesn't it will break on the next release Jul 27 09:32:24 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Acquiring lock "connect_volume" by "os_brick.initiator.connectors.iscsi.ISCSIConnector.connect_volume" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:68}} Jul 27 09:32:24 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Lock 
"connect_volume" acquired by "os_brick.initiator.connectors.iscsi.ISCSIConnector.connect_volume" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:73}} Jul 27 09:32:24 user nova-compute[70374]: WARNING os_brick.initiator.connectors.base [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Service needs to call os_brick.setup() before connecting volumes, if it doesn't it will break on the next release Jul 27 09:32:24 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Acquiring lock "connect_to_iscsi_portal-172.16.0.220:3260-iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291" by "os_brick.initiator.connectors.iscsi.ISCSIConnector._connect_to_iscsi_portal_unsafe" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:68}} Jul 27 09:32:24 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Lock "connect_to_iscsi_portal-172.16.0.220:3260-iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291" acquired by "os_brick.initiator.connectors.iscsi.ISCSIConnector._connect_to_iscsi_portal_unsafe" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:73}} Jul 27 09:32:24 user nova-compute[70374]: INFO os_brick.initiator.connectors.iscsi [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Trying to connect to iSCSI portal 172.16.0.220:3260 Jul 27 09:32:24 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 -p 172.16.0.220:3260 {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:24 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 -p 172.16.0.220:3260" returned: 21 in 0.017s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:24 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[95434537-dcea-4e3f-83bc-7c74b1a04358]: (4, ('', 'iscsiadm: No records found\n')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:24 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] iscsiadm (): stdout= stderr=iscsiadm: No records found Jul 27 09:32:24 user nova-compute[70374]: {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:24 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 -p 172.16.0.220:3260 --interface default --op new {{(pid=82164) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:24 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 -p 172.16.0.220:3260 --interface default --op new" returned: 0 in 0.019s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:24 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[dc114099-ef0f-426e-b221-285e11c607c1]: (4, ('New iSCSI node [tcp:[hw=,ip=,net_if=,iscsi_if=default] 172.16.0.220,3260,-1 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291] added\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:24 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] iscsiadm ('--interface', 'default', '--op', 'new'): stdout=New iSCSI node [tcp:[hw=,ip=,net_if=,iscsi_if=default] 172.16.0.220,3260,-1 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291] added Jul 27 09:32:24 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:24 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 -p 172.16.0.220:3260 --op update -n node.session.scan -v manual {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:24 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 -p 172.16.0.220:3260 --op update -n node.session.scan -v manual" returned: 0 in 0.024s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:24 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[ddb06026-489c-4951-b884-130439ebbef9]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:24 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] iscsiadm ('--op', 'update', '-n', 'node.session.scan', '-v', 'manual'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:24 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 -p 172.16.0.220:3260 --op update -n node.session.auth.authmethod -v CHAP {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:24 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 -p 172.16.0.220:3260 --op update -n node.session.auth.authmethod -v CHAP" returned: 0 in 0.022s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:24 user nova-compute[70374]: DEBUG 
oslo.privsep.daemon [-] privsep: reply[b81bbec3-12ef-450b-aa90-91b1fc722956]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:24 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] iscsiadm ('--op', 'update', '-n', 'node.session.auth.authmethod', '-v', 'CHAP'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:24 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 -p 172.16.0.220:3260 --op update -n node.session.auth.username -v RmCwbdGB {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:24 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 -p 172.16.0.220:3260 --op update -n node.session.auth.username -v RmCwbdGB" returned: 0 in 0.022s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:24 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[ed530971-fbf0-4235-baa9-c3c8b7d52527]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:24 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] iscsiadm ('--op', 'update', '-n', 'node.session.auth.username', '-v', 'RmCwbdGB'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:24 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 -p 172.16.0.220:3260 --op update -n node.session.auth.password -v *** {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:24 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 -p 172.16.0.220:3260 --op update -n node.session.auth.password -v ***" returned: 0 in 0.024s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:24 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[18743241-b930-48eb-85f8-f1d99a6e7e75]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:24 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] iscsiadm ('--op', 'update', '-n', 'node.session.auth.password', '-v', '***'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:24 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m session 
{{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:24 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m session" returned: 0 in 0.018s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:24 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[3d749973-9ef7-4071-9309-48c57359a1bd]: (4, ('tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash)\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:24 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] iscsiadm ('-m', 'session'): stdout=tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash) Jul 27 09:32:24 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm_bare /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1182}} Jul 27 09:32:24 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] iscsi session list stdout=tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash) Jul 27 09:32:24 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsi_session /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1171}} Jul 27 09:32:24 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 -p 172.16.0.220:3260 --login {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:25 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 -p 172.16.0.220:3260 --login" returned: 0 in 0.053s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:25 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[8b83a01c-4a7a-4f32-872d-cf5bb9af77e9]: (4, ('Logging in to [iface: default, target: iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291, portal: 172.16.0.220,3260]\nLogin to [iface: default, target: iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291, portal: 172.16.0.220,3260] successful.\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:25 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] iscsiadm ('--login',): stdout=Logging in to [iface: default, target: iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291, portal: 172.16.0.220,3260] Jul 27 09:32:25 user nova-compute[70374]: Login to [iface: default, target: iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291, portal: 172.16.0.220,3260] successful. 
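The entries above trace the fixed iscsiadm sequence os-brick runs for a CHAP-authenticated attach: probe for an existing node record (exit code 21, "iscsiadm: No records found"), create one with --op new, switch node.session.scan to manual so the kernel does not auto-scan, set the CHAP authmethod, username and password, list current sessions, log in to the portal, and finally mark node.startup automatic. Below is a minimal sketch that replays the same commands outside of Nova; the portal, IQN and username are the values visible in this log, the password is redacted here exactly as it is in the log, and running it needs root plus the open-iscsi tools installed.

import subprocess

# Target details as they appear in the log entries above; substitute your own.
PORTAL = "172.16.0.220:3260"
IQN = "iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291"
CHAP_USER = "RmCwbdGB"
CHAP_PASSWORD = "***"  # redacted in the log; supply the real CHAP secret

def iscsiadm(*args, check=True):
    # Same command shape as the "Running cmd (subprocess): iscsiadm ..." entries.
    cmd = ["iscsiadm", "-m", "node", "-T", IQN, "-p", PORTAL, *args]
    return subprocess.run(cmd, capture_output=True, text=True, check=check)

iscsiadm(check=False)                              # exit code 21 when no record exists yet
iscsiadm("--interface", "default", "--op", "new")  # create the node record
iscsiadm("--op", "update", "-n", "node.session.scan", "-v", "manual")
iscsiadm("--op", "update", "-n", "node.session.auth.authmethod", "-v", "CHAP")
iscsiadm("--op", "update", "-n", "node.session.auth.username", "-v", CHAP_USER)
iscsiadm("--op", "update", "-n", "node.session.auth.password", "-v", CHAP_PASSWORD)
subprocess.run(["iscsiadm", "-m", "session"], capture_output=True, text=True)  # existing sessions
iscsiadm("--login")                                # "Login to [...] successful."
iscsiadm("--op", "update", "-n", "node.startup", "-v", "automatic")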
Jul 27 09:32:25 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:25 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 -p 172.16.0.220:3260 --op update -n node.startup -v automatic {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:25 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 -p 172.16.0.220:3260 --op update -n node.startup -v automatic" returned: 0 in 0.018s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:25 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[ed6ba0f6-f3f8-4e1d-8291-d79be277c2e1]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:25 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] iscsiadm ('--op', 'update', '-n', 'node.startup', '-v', 'automatic'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:25 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m session {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:25 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m session" returned: 0 in 0.021s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:25 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[c75ab3c9-504f-4c16-9a7e-616c295b5a65]: (4, ('tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash)\ntcp: [4] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash)\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:25 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] iscsiadm ('-m', 'session'): stdout=tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash) Jul 27 09:32:25 user nova-compute[70374]: tcp: [4] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash) Jul 27 09:32:25 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm_bare /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1182}} Jul 27 09:32:25 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] iscsi session list stdout=tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash) Jul 27 09:32:25 user nova-compute[70374]: tcp: [4] 172.16.0.220:3260,1 
iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash) Jul 27 09:32:25 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsi_session /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1171}} Jul 27 09:32:25 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Lock "connect_to_iscsi_portal-172.16.0.220:3260-iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291" "released" by "os_brick.initiator.connectors.iscsi.ISCSIConnector._connect_to_iscsi_portal_unsafe" :: held 0.281s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:87}} Jul 27 09:32:25 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Connected to 172.16.0.220:3260 {{(pid=70374) _connect_vol /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:633}} Jul 27 09:32:25 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] HCTL ('33', '-', '-', 0) found on session 4 with lun 0 {{(pid=70374) get_hctl /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:694}} Jul 27 09:32:25 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Scanning host 33 c: -, t: -, l: 0) {{(pid=70374) scan_iscsi /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:719}} Jul 27 09:32:25 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): tee -a /sys/class/scsi_host/host33/scan {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:25 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "tee -a /sys/class/scsi_host/host33/scan" returned: 0 in 0.016s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:25 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[bf19f538-e6a0-4d81-b0fe-521dce155987]: (4, ('- - 0', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:25 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Searching for a device in session 4 and hctl ['33', '*', '*', 0] yield: None {{(pid=70374) device_name_by_hctl /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:713}} Jul 27 09:32:25 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Acquiring lock "2e225354-053f-48f5-ad3f-6662b627d531" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:25 user 
nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Lock "2e225354-053f-48f5-ad3f-6662b627d531" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:25 user nova-compute[70374]: DEBUG nova.compute.manager [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Starting instance... {{(pid=70374) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} Jul 27 09:32:25 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:25 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:25 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Require both a host and instance NUMA topology to fit instance on host. 
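The inventory payload repeated in the surrounding entries is what this host exposes to Placement for provider f7548644-4a09-4ad8-9aa6-6e05d85a9f5b. A quick sanity check of the capacity behind those numbers, assuming Placement's usual rule of capacity = (total - reserved) * allocation_ratio:

# Inventory exactly as logged for provider f7548644-4a09-4ad8-9aa6-6e05d85a9f5b.
inventory = {
    "VCPU": {"total": 12, "reserved": 0, "allocation_ratio": 4.0},
    "MEMORY_MB": {"total": 16011, "reserved": 512, "allocation_ratio": 1.0},
    "DISK_GB": {"total": 40, "reserved": 0, "allocation_ratio": 1.0},
}

for rc, inv in inventory.items():
    capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
    print(f"{rc}: {capacity:g}")  # VCPU: 48, MEMORY_MB: 15499, DISK_GB: 40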
{{(pid=70374) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Jul 27 09:32:25 user nova-compute[70374]: INFO nova.compute.claims [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Claim successful on node user Jul 27 09:32:25 user nova-compute[70374]: DEBUG nova.compute.provider_tree [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Inventory has not changed in ProviderTree for provider: f7548644-4a09-4ad8-9aa6-6e05d85a9f5b {{(pid=70374) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Jul 27 09:32:25 user nova-compute[70374]: DEBUG nova.scheduler.client.report [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Inventory has not changed for provider f7548644-4a09-4ad8-9aa6-6e05d85a9f5b based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16011, 'reserved': 512, 'min_unit': 1, 'max_unit': 16011, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70374) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Jul 27 09:32:25 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.510s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:25 user nova-compute[70374]: DEBUG nova.compute.manager [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Start building networks asynchronously for instance. {{(pid=70374) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} Jul 27 09:32:25 user nova-compute[70374]: DEBUG nova.compute.manager [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Allocating IP information in the background. {{(pid=70374) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} Jul 27 09:32:25 user nova-compute[70374]: DEBUG nova.network.neutron [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] allocate_for_instance() {{(pid=70374) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} Jul 27 09:32:25 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Jul 27 09:32:25 user nova-compute[70374]: DEBUG nova.compute.manager [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Start building block device mappings for instance. {{(pid=70374) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} Jul 27 09:32:26 user nova-compute[70374]: DEBUG nova.policy [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4c1d49c23b0045d881d9f47e33447162', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bfaa69b3745a435795aa636ccddec3af', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70374) authorize /opt/stack/nova/nova/policy.py:203}} Jul 27 09:32:26 user nova-compute[70374]: DEBUG nova.compute.manager [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Start spawning the instance on the hypervisor. {{(pid=70374) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} Jul 27 09:32:26 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Creating instance directory {{(pid=70374) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4717}} Jul 27 09:32:26 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Creating image(s) Jul 27 09:32:26 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Acquiring lock "/opt/stack/data/nova/instances/2e225354-053f-48f5-ad3f-6662b627d531/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:26 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Lock "/opt/stack/data/nova/instances/2e225354-053f-48f5-ad3f-6662b627d531/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:26 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Lock 
"/opt/stack/data/nova/instances/2e225354-053f-48f5-ad3f-6662b627d531/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:26 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Searching for a device in session 4 and hctl ['33', '*', '*', 0] yield: sdb {{(pid=70374) device_name_by_hctl /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:713}} Jul 27 09:32:26 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Connected to sdb using {'target_iqn': 'iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291', 'target_portal': '172.16.0.220:3260', 'target_discovered': False, 'auth_method': 'CHAP', 'auth_username': 'RmCwbdGB', 'auth_password': '***', 'target_lun': 0, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False} {{(pid=70374) _connect_vol /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:661}} Jul 27 09:32:26 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Lock "connect_volume" "released" by "os_brick.initiator.connectors.iscsi.ISCSIConnector.connect_volume" :: held 1.311s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:87}} Jul 27 09:32:26 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] <== connect_volume: return (1312ms) {'type': 'block', 'scsi_wwn': '26635356331636336', 'path': '/dev/sdb'} {{(pid=70374) trace_logging_wrapper /usr/local/lib/python3.10/dist-packages/os_brick/utils.py:202}} Jul 27 09:32:26 user nova-compute[70374]: DEBUG nova.virt.libvirt.volume.iscsi [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Attached iSCSI volume {'type': 'block', 'scsi_wwn': '26635356331636336', 'path': '/dev/sdb'} {{(pid=70374) connect_volume /opt/stack/nova/nova/virt/libvirt/volume/iscsi.py:65}} Jul 27 09:32:26 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:26 user nova-compute[70374]: DEBUG nova.objects.instance [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Lazy-loading 'flavor' on Instance uuid 25214e8a-c626-46a7-b273-eb491c2fc91b {{(pid=70374) 
obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:32:26 user nova-compute[70374]: DEBUG nova.virt.libvirt.guest [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] attach device xml: [libvirt device XML removed by extraction; only the volume serial 788f71f6-0aca-4da9-a915-59560f5cb291 survives] {{(pid=70374) attach_device /opt/stack/nova/nova/virt/libvirt/guest.py:339}} Jul 27 09:32:26 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json" returned: 0 in 0.310s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:26 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Acquiring lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:26 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:26 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:26 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json" returned: 0 in 0.151s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:26 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Running cmd (subprocess): env 
LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1,backing_fmt=raw /opt/stack/data/nova/instances/2e225354-053f-48f5-ad3f-6662b627d531/disk 1073741824 {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:26 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1,backing_fmt=raw /opt/stack/data/nova/instances/2e225354-053f-48f5-ad3f-6662b627d531/disk 1073741824" returned: 0 in 0.048s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:26 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Lock "eecbf4ea539212b0e09996b77db3066885fb0ed1" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.204s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:26 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:26 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:26 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] No BDM found with device name vda, not building metadata. {{(pid=70374) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Jul 27 09:32:26 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] No BDM found with device name vdb, not building metadata. 
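The qemu-img calls in the surrounding entries show how the Qcow2 image backend assembles the root disk for instance 2e225354-053f-48f5-ad3f-6662b627d531: qemu-img info (wrapped in oslo_concurrency.prlimit to cap address space and CPU time) inspects the cached base image under instances/_base, and qemu-img create then builds a copy-on-write overlay with that base as backing_file (backing_fmt=raw) at the 1073741824-byte size seen in the logged command. A minimal standalone sketch of the same two steps, assuming qemu-img is installed; the paths are copied from the log, so point them somewhere writable before running:

import subprocess

BASE = "/opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1"
OVERLAY = "/opt/stack/data/nova/instances/2e225354-053f-48f5-ad3f-6662b627d531/disk"
SIZE_BYTES = 1 * 1024 ** 3  # 1073741824, matching the logged command

# Create the qcow2 overlay backed by the raw base image, as in the logged qemu-img create.
subprocess.run(
    ["env", "LC_ALL=C", "LANG=C", "qemu-img", "create", "-f", "qcow2",
     "-o", f"backing_file={BASE},backing_fmt=raw", OVERLAY, str(SIZE_BYTES)],
    check=True)

# Inspect the result the same way the log does (--force-share avoids locking an in-use disk).
info = subprocess.run(
    ["env", "LC_ALL=C", "LANG=C", "qemu-img", "info", OVERLAY, "--force-share", "--output=json"],
    capture_output=True, text=True, check=True)
print(info.stdout)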
{{(pid=70374) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Jul 27 09:32:26 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] No VIF found with MAC fa:16:3e:b2:af:a6, not building metadata {{(pid=70374) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Jul 27 09:32:26 user nova-compute[70374]: DEBUG nova.network.neutron [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Successfully created port: ddf41035-1805-4fde-b4cf-d86ab3e59c3b {{(pid=70374) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} Jul 27 09:32:26 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/eecbf4ea539212b0e09996b77db3066885fb0ed1 --force-share --output=json" returned: 0 in 0.145s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:26 user nova-compute[70374]: DEBUG nova.virt.disk.api [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Checking if we can resize image /opt/stack/data/nova/instances/2e225354-053f-48f5-ad3f-6662b627d531/disk. size=1073741824 {{(pid=70374) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Jul 27 09:32:26 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2e225354-053f-48f5-ad3f-6662b627d531/disk --force-share --output=json {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:26 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-3fe386c4-9839-4cc6-9451-5e770a2e358c tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Lock "25214e8a-c626-46a7-b273-eb491c2fc91b" "released" by "nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" :: held 12.297s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:26 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2e225354-053f-48f5-ad3f-6662b627d531/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:26 user nova-compute[70374]: DEBUG nova.virt.disk.api [None req-3926d077-4dad-4892-b624-90c9fb8cf216 
tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Cannot resize image /opt/stack/data/nova/instances/2e225354-053f-48f5-ad3f-6662b627d531/disk to a smaller size. {{(pid=70374) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Jul 27 09:32:26 user nova-compute[70374]: DEBUG nova.objects.instance [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Lazy-loading 'migration_context' on Instance uuid 2e225354-053f-48f5-ad3f-6662b627d531 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:32:26 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Created local disks {{(pid=70374) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4851}} Jul 27 09:32:26 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Ensure instance console log exists: /opt/stack/data/nova/instances/2e225354-053f-48f5-ad3f-6662b627d531/console.log {{(pid=70374) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4603}} Jul 27 09:32:26 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:26 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:26 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:27 user nova-compute[70374]: DEBUG nova.network.neutron [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Successfully updated port: ddf41035-1805-4fde-b4cf-d86ab3e59c3b {{(pid=70374) _update_port /opt/stack/nova/nova/network/neutron.py:586}} Jul 27 09:32:27 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Acquiring lock "refresh_cache-2e225354-053f-48f5-ad3f-6662b627d531" {{(pid=70374) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Jul 27 09:32:27 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Acquired lock "refresh_cache-2e225354-053f-48f5-ad3f-6662b627d531" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Jul 27 09:32:27 user nova-compute[70374]: DEBUG nova.network.neutron [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Building network info cache for instance {{(pid=70374) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} Jul 27 09:32:27 user nova-compute[70374]: DEBUG nova.compute.manager [req-1bd132d5-7620-41ed-ba7b-54b01313aeac req-55be9905-71d1-4105-a511-b4f5ad7c6c8b service nova] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Received event network-changed-ddf41035-1805-4fde-b4cf-d86ab3e59c3b {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:32:27 user nova-compute[70374]: DEBUG nova.compute.manager [req-1bd132d5-7620-41ed-ba7b-54b01313aeac req-55be9905-71d1-4105-a511-b4f5ad7c6c8b service nova] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Refreshing instance network info cache due to event network-changed-ddf41035-1805-4fde-b4cf-d86ab3e59c3b. {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Jul 27 09:32:27 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-1bd132d5-7620-41ed-ba7b-54b01313aeac req-55be9905-71d1-4105-a511-b4f5ad7c6c8b service nova] Acquiring lock "refresh_cache-2e225354-053f-48f5-ad3f-6662b627d531" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Jul 27 09:32:27 user nova-compute[70374]: DEBUG nova.network.neutron [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Instance cache missing network info. 
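Nearly every entry in this log pairs an "Acquiring lock" line with a matching acquired/released (or Releasing) line; those come from oslo.concurrency's lockutils, which reports how long each caller waited for and held the lock. The compute_resources messages go through the synchronized decorator wrapper (the inner calls at lockutils.py:404/409/423 above), while the refresh_cache-<uuid> and singleton_lock messages use the lock() context manager (lockutils.py:312/315/333). A minimal sketch of both patterns with hypothetical function bodies, assuming oslo.concurrency is installed:

from oslo_concurrency import lockutils

# Decorator form: serialises callers on a named lock, like the "compute_resources" messages.
@lockutils.synchronized("compute_resources")
def update_usage():
    pass  # mutate shared resource-tracker state here

# Context-manager form: an ad-hoc lock such as refresh_cache-<instance uuid>.
def refresh_cache(instance_uuid):
    with lockutils.lock(f"refresh_cache-{instance_uuid}"):
        pass  # rebuild the instance network info cache here

update_usage()
refresh_cache("2e225354-053f-48f5-ad3f-6662b627d531")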
{{(pid=70374) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3315}} Jul 27 09:32:27 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-f88ccdd6-0832-48c7-b6ce-bc904bd31c87 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Acquiring lock "25214e8a-c626-46a7-b273-eb491c2fc91b" by "nova.compute.manager.ComputeManager.stop_instance..do_stop_instance" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:27 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-f88ccdd6-0832-48c7-b6ce-bc904bd31c87 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Lock "25214e8a-c626-46a7-b273-eb491c2fc91b" acquired by "nova.compute.manager.ComputeManager.stop_instance..do_stop_instance" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:27 user nova-compute[70374]: DEBUG nova.compute.manager [None req-f88ccdd6-0832-48c7-b6ce-bc904bd31c87 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:32:27 user nova-compute[70374]: DEBUG nova.compute.manager [None req-f88ccdd6-0832-48c7-b6ce-bc904bd31c87 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 {{(pid=70374) do_stop_instance /opt/stack/nova/nova/compute/manager.py:3342}} Jul 27 09:32:27 user nova-compute[70374]: DEBUG nova.objects.instance [None req-f88ccdd6-0832-48c7-b6ce-bc904bd31c87 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Lazy-loading 'flavor' on Instance uuid 25214e8a-c626-46a7-b273-eb491c2fc91b {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG nova.network.neutron [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Updating instance_info_cache with network_info: [{"id": "ddf41035-1805-4fde-b4cf-d86ab3e59c3b", "address": "fa:16:3e:e4:cc:69", "network": {"id": "fb617df7-46b2-48ba-beb1-29eefa43aa47", 
"bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-96492016-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "bfaa69b3745a435795aa636ccddec3af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapddf41035-18", "ovs_interfaceid": "ddf41035-1805-4fde-b4cf-d86ab3e59c3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Releasing lock "refresh_cache-2e225354-053f-48f5-ad3f-6662b627d531" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG nova.compute.manager [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Instance network_info: |[{"id": "ddf41035-1805-4fde-b4cf-d86ab3e59c3b", "address": "fa:16:3e:e4:cc:69", "network": {"id": "fb617df7-46b2-48ba-beb1-29eefa43aa47", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-96492016-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "bfaa69b3745a435795aa636ccddec3af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapddf41035-18", "ovs_interfaceid": "ddf41035-1805-4fde-b4cf-d86ab3e59c3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=70374) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-1bd132d5-7620-41ed-ba7b-54b01313aeac req-55be9905-71d1-4105-a511-b4f5ad7c6c8b service nova] Acquired lock "refresh_cache-2e225354-053f-48f5-ad3f-6662b627d531" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG nova.network.neutron [req-1bd132d5-7620-41ed-ba7b-54b01313aeac req-55be9905-71d1-4105-a511-b4f5ad7c6c8b service nova] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Refreshing network info cache for port ddf41035-1805-4fde-b4cf-d86ab3e59c3b {{(pid=70374) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-3926d077-4dad-4892-b624-90c9fb8cf216 
tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Start _get_guest_xml network_info=[{"id": "ddf41035-1805-4fde-b4cf-d86ab3e59c3b", "address": "fa:16:3e:e4:cc:69", "network": {"id": "fb617df7-46b2-48ba-beb1-29eefa43aa47", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-96492016-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "bfaa69b3745a435795aa636ccddec3af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapddf41035-18", "ovs_interfaceid": "ddf41035-1805-4fde-b4cf-d86ab3e59c3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-07-27T09:26:27Z,direct_url=,disk_format='qcow2',id=35458adf-261a-4e0b-a4db-b243619b2394,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='37aff788807d4b2aafc10d355583f7c7',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-07-27T09:26:29Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'encrypted': False, 'device_type': 'disk', 'guest_format': None, 'boot_index': 0, 'encryption_format': None, 'size': 0, 'encryption_options': None, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'image_id': '35458adf-261a-4e0b-a4db-b243619b2394'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=70374) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7553}} Jul 27 09:32:28 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Jul 27 09:32:28 user nova-compute[70374]: WARNING nova.virt.libvirt.driver [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Jul 27 09:32:28 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Searching host: 'user' for CPU controller through CGroups V1... 
{{(pid=70374) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1650}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] CPU controller missing on host. {{(pid=70374) _has_cgroupsv1_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1660}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Searching host: 'user' for CPU controller through CGroups V2... {{(pid=70374) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1669}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG nova.virt.libvirt.host [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] CPU controller found on host. {{(pid=70374) _has_cgroupsv2_cpu_controller /opt/stack/nova/nova/virt/libvirt/host.py:1676}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=70374) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5390}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-07-27T09:26:16Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1',id=6,is_public=True,memory_mb=512,name='m1.tiny',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2023-07-27T09:26:27Z,direct_url=,disk_format='qcow2',id=35458adf-261a-4e0b-a4db-b243619b2394,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk',owner='37aff788807d4b2aafc10d355583f7c7',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2023-07-27T09:26:29Z,virtual_size=,visibility=), allow threads: True {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Flavor limits 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Image limits 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Flavor pref 0:0:0 {{(pid=70374) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:383}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Image pref 0:0:0 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=70374) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=70374) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Got 1 possible topologies {{(pid=70374) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG nova.virt.hardware [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=70374) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG nova.virt.libvirt.vif [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-07-27T09:32:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-471243575',display_name='tempest-AttachVolumeNegativeTest-server-471243575',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(6),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-471243575',id=14,image_ref='35458adf-261a-4e0b-a4db-b243619b2394',info_cache=InstanceInfoCache,instance_type_id=6,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFXIUeFJxpF4vk5nlGGsHNsBOgmU9WA3rEQ6vxDmkBZP7AGy3dWxt/NMowMl08/WRjc5o5ePL2B6OPh8GFtYgtYo1IQP06pzdbpul/oSVpQqD1ny9pUHoKY28t+y4OcZng==',key_name='tempest-keypair-254621140',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bfaa69b3745a435795aa636ccddec3af',ramdisk_id='',reservation_id='r-w8tuz66i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='35458adf-261a-4e0b-a4db-b243619b2394',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-305502590',owner_user_name='tempest-AttachVolumeNegativeTest-305502590-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-07-27T09:32:26Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='4c1d49c23b0045d881d9f47e33447162',uuid=2e225354-053f-48f5-ad3f-6662b627d531,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ddf41035-1805-4fde-b4cf-d86ab3e59c3b", "address": "fa:16:3e:e4:cc:69", "network": {"id": "fb617df7-46b2-48ba-beb1-29eefa43aa47", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-96492016-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "bfaa69b3745a435795aa636ccddec3af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapddf41035-18", "ovs_interfaceid": "ddf41035-1805-4fde-b4cf-d86ab3e59c3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70374) get_config 
/opt/stack/nova/nova/virt/libvirt/vif.py:563}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Converting VIF {"id": "ddf41035-1805-4fde-b4cf-d86ab3e59c3b", "address": "fa:16:3e:e4:cc:69", "network": {"id": "fb617df7-46b2-48ba-beb1-29eefa43aa47", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-96492016-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "bfaa69b3745a435795aa636ccddec3af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapddf41035-18", "ovs_interfaceid": "ddf41035-1805-4fde-b4cf-d86ab3e59c3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e4:cc:69,bridge_name='br-int',has_traffic_filtering=True,id=ddf41035-1805-4fde-b4cf-d86ab3e59c3b,network=Network(fb617df7-46b2-48ba-beb1-29eefa43aa47),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddf41035-18') {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG nova.objects.instance [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Lazy-loading 'pci_devices' on Instance uuid 2e225354-053f-48f5-ad3f-6662b627d531 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] End _get_guest_xml xml=
[guest domain XML not reproduced: the markup of the XML dumped by _get_guest_xml was stripped during log capture, leaving only bare element values interleaved with empty syslog prefixes; surviving values include uuid 2e225354-053f-48f5-ad3f-6662b627d531, name instance-0000000e, memory 524288, vcpus 1, title tempest-AttachVolumeNegativeTest-server-471243575, creation time 2023-07-27 09:32:28, project tempest-AttachVolumeNegativeTest-305502590, sysinfo OpenStack Foundation / OpenStack Nova / 0.0.0, os type hvm, CPU model Nehalem, rng backend /dev/urandom]
{{(pid=70374) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7559}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG nova.virt.libvirt.vif [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] vif_type=ovs
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-07-27T09:32:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-471243575',display_name='tempest-AttachVolumeNegativeTest-server-471243575',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(6),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-471243575',id=14,image_ref='35458adf-261a-4e0b-a4db-b243619b2394',info_cache=InstanceInfoCache,instance_type_id=6,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFXIUeFJxpF4vk5nlGGsHNsBOgmU9WA3rEQ6vxDmkBZP7AGy3dWxt/NMowMl08/WRjc5o5ePL2B6OPh8GFtYgtYo1IQP06pzdbpul/oSVpQqD1ny9pUHoKY28t+y4OcZng==',key_name='tempest-keypair-254621140',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bfaa69b3745a435795aa636ccddec3af',ramdisk_id='',reservation_id='r-w8tuz66i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='35458adf-261a-4e0b-a4db-b243619b2394',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-305502590',owner_user_name='tempest-AttachVolumeNegativeTest-305502590-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-07-27T09:32:26Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='4c1d49c23b0045d881d9f47e33447162',uuid=2e225354-053f-48f5-ad3f-6662b627d531,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ddf41035-1805-4fde-b4cf-d86ab3e59c3b", "address": "fa:16:3e:e4:cc:69", "network": {"id": "fb617df7-46b2-48ba-beb1-29eefa43aa47", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-96492016-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "bfaa69b3745a435795aa636ccddec3af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapddf41035-18", "ovs_interfaceid": "ddf41035-1805-4fde-b4cf-d86ab3e59c3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) plug 
/opt/stack/nova/nova/virt/libvirt/vif.py:710}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Converting VIF {"id": "ddf41035-1805-4fde-b4cf-d86ab3e59c3b", "address": "fa:16:3e:e4:cc:69", "network": {"id": "fb617df7-46b2-48ba-beb1-29eefa43aa47", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-96492016-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "bfaa69b3745a435795aa636ccddec3af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapddf41035-18", "ovs_interfaceid": "ddf41035-1805-4fde-b4cf-d86ab3e59c3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e4:cc:69,bridge_name='br-int',has_traffic_filtering=True,id=ddf41035-1805-4fde-b4cf-d86ab3e59c3b,network=Network(fb617df7-46b2-48ba-beb1-29eefa43aa47),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddf41035-18') {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG os_vif [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e4:cc:69,bridge_name='br-int',has_traffic_filtering=True,id=ddf41035-1805-4fde-b4cf-d86ab3e59c3b,network=Network(fb617df7-46b2-48ba-beb1-29eefa43aa47),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddf41035-18') {{(pid=70374) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG 
ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapddf41035-18, may_exist=True, interface_attrs={}) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapddf41035-18, col_values=(('external_ids', {'iface-id': 'ddf41035-1805-4fde-b4cf-d86ab3e59c3b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e4:cc:69', 'vm-uuid': '2e225354-053f-48f5-ad3f-6662b627d531'}),), if_exists=True) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:28 user nova-compute[70374]: INFO os_vif [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e4:cc:69,bridge_name='br-int',has_traffic_filtering=True,id=ddf41035-1805-4fde-b4cf-d86ab3e59c3b,network=Network(fb617df7-46b2-48ba-beb1-29eefa43aa47),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddf41035-18') Jul 27 09:32:28 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] No BDM found with device name vda, not building metadata. {{(pid=70374) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] No VIF found with MAC fa:16:3e:e4:cc:69, not building metadata {{(pid=70374) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG nova.network.neutron [req-1bd132d5-7620-41ed-ba7b-54b01313aeac req-55be9905-71d1-4105-a511-b4f5ad7c6c8b service nova] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Updated VIF entry in instance network info cache for port ddf41035-1805-4fde-b4cf-d86ab3e59c3b. 
{{(pid=70374) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG nova.network.neutron [req-1bd132d5-7620-41ed-ba7b-54b01313aeac req-55be9905-71d1-4105-a511-b4f5ad7c6c8b service nova] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Updating instance_info_cache with network_info: [{"id": "ddf41035-1805-4fde-b4cf-d86ab3e59c3b", "address": "fa:16:3e:e4:cc:69", "network": {"id": "fb617df7-46b2-48ba-beb1-29eefa43aa47", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-96492016-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "bfaa69b3745a435795aa636ccddec3af", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapddf41035-18", "ovs_interfaceid": "ddf41035-1805-4fde-b4cf-d86ab3e59c3b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:32:28 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-1bd132d5-7620-41ed-ba7b-54b01313aeac req-55be9905-71d1-4105-a511-b4f5ad7c6c8b service nova] Releasing lock "refresh_cache-2e225354-053f-48f5-ad3f-6662b627d531" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Jul 27 09:32:29 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:29 user nova-compute[70374]: INFO nova.virt.libvirt.driver [-] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Instance destroyed successfully. 
Jul 27 09:32:29 user nova-compute[70374]: DEBUG nova.compute.manager [None req-f88ccdd6-0832-48c7-b6ce-bc904bd31c87 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:32:29 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-f88ccdd6-0832-48c7-b6ce-bc904bd31c87 tempest-AttachVolumeTestJSON-847846253 tempest-AttachVolumeTestJSON-847846253-project-member] Lock "25214e8a-c626-46a7-b273-eb491c2fc91b" "released" by "nova.compute.manager.ComputeManager.stop_instance..do_stop_instance" :: held 1.423s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:29 user nova-compute[70374]: DEBUG nova.compute.manager [req-b204dae1-befa-4208-85e7-7cdccb5f88c0 req-85732936-808d-4351-9d2a-b3627076df73 service nova] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Received event network-vif-unplugged-b825d9b4-15c4-4b47-a3f7-af9838d09458 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:32:29 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-b204dae1-befa-4208-85e7-7cdccb5f88c0 req-85732936-808d-4351-9d2a-b3627076df73 service nova] Acquiring lock "25214e8a-c626-46a7-b273-eb491c2fc91b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:29 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-b204dae1-befa-4208-85e7-7cdccb5f88c0 req-85732936-808d-4351-9d2a-b3627076df73 service nova] Lock "25214e8a-c626-46a7-b273-eb491c2fc91b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:29 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-b204dae1-befa-4208-85e7-7cdccb5f88c0 req-85732936-808d-4351-9d2a-b3627076df73 service nova] Lock "25214e8a-c626-46a7-b273-eb491c2fc91b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:29 user nova-compute[70374]: DEBUG nova.compute.manager [req-b204dae1-befa-4208-85e7-7cdccb5f88c0 req-85732936-808d-4351-9d2a-b3627076df73 service nova] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] No waiting events found dispatching network-vif-unplugged-b825d9b4-15c4-4b47-a3f7-af9838d09458 {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:32:29 user nova-compute[70374]: WARNING nova.compute.manager [req-b204dae1-befa-4208-85e7-7cdccb5f88c0 req-85732936-808d-4351-9d2a-b3627076df73 service nova] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Received unexpected event network-vif-unplugged-b825d9b4-15c4-4b47-a3f7-af9838d09458 for instance with vm_state stopped and task_state None. 
Jul 27 09:32:29 user nova-compute[70374]: DEBUG nova.compute.manager [req-b204dae1-befa-4208-85e7-7cdccb5f88c0 req-85732936-808d-4351-9d2a-b3627076df73 service nova] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Received event network-vif-plugged-b825d9b4-15c4-4b47-a3f7-af9838d09458 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:32:29 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-b204dae1-befa-4208-85e7-7cdccb5f88c0 req-85732936-808d-4351-9d2a-b3627076df73 service nova] Acquiring lock "25214e8a-c626-46a7-b273-eb491c2fc91b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:29 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-b204dae1-befa-4208-85e7-7cdccb5f88c0 req-85732936-808d-4351-9d2a-b3627076df73 service nova] Lock "25214e8a-c626-46a7-b273-eb491c2fc91b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:29 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-b204dae1-befa-4208-85e7-7cdccb5f88c0 req-85732936-808d-4351-9d2a-b3627076df73 service nova] Lock "25214e8a-c626-46a7-b273-eb491c2fc91b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:29 user nova-compute[70374]: DEBUG nova.compute.manager [req-b204dae1-befa-4208-85e7-7cdccb5f88c0 req-85732936-808d-4351-9d2a-b3627076df73 service nova] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] No waiting events found dispatching network-vif-plugged-b825d9b4-15c4-4b47-a3f7-af9838d09458 {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:32:29 user nova-compute[70374]: WARNING nova.compute.manager [req-b204dae1-befa-4208-85e7-7cdccb5f88c0 req-85732936-808d-4351-9d2a-b3627076df73 service nova] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Received unexpected event network-vif-plugged-b825d9b4-15c4-4b47-a3f7-af9838d09458 for instance with vm_state stopped and task_state None. 
Jul 27 09:32:29 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:29 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:30 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:30 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:30 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:30 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:30 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:30 user nova-compute[70374]: DEBUG nova.compute.manager [req-d9437a69-5202-4cbd-913d-b9f1b2ae4e6d req-ddeaf04c-d1bf-4480-8146-bfe0d9381459 service nova] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Received event network-changed-cc9444e6-2840-433c-b7c9-c0df023f6e43 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:32:30 user nova-compute[70374]: DEBUG nova.compute.manager [req-d9437a69-5202-4cbd-913d-b9f1b2ae4e6d req-ddeaf04c-d1bf-4480-8146-bfe0d9381459 service nova] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Refreshing instance network info cache due to event network-changed-cc9444e6-2840-433c-b7c9-c0df023f6e43. {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Jul 27 09:32:30 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-d9437a69-5202-4cbd-913d-b9f1b2ae4e6d req-ddeaf04c-d1bf-4480-8146-bfe0d9381459 service nova] Acquiring lock "refresh_cache-42f4c546-47e4-485b-be29-4081c7557bad" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Jul 27 09:32:30 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-d9437a69-5202-4cbd-913d-b9f1b2ae4e6d req-ddeaf04c-d1bf-4480-8146-bfe0d9381459 service nova] Acquired lock "refresh_cache-42f4c546-47e4-485b-be29-4081c7557bad" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Jul 27 09:32:30 user nova-compute[70374]: DEBUG nova.network.neutron [req-d9437a69-5202-4cbd-913d-b9f1b2ae4e6d req-ddeaf04c-d1bf-4480-8146-bfe0d9381459 service nova] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Refreshing network info cache for port cc9444e6-2840-433c-b7c9-c0df023f6e43 {{(pid=70374) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Jul 27 09:32:30 user nova-compute[70374]: DEBUG nova.network.neutron [req-d9437a69-5202-4cbd-913d-b9f1b2ae4e6d req-ddeaf04c-d1bf-4480-8146-bfe0d9381459 service nova] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Updated VIF entry in instance network info cache for port cc9444e6-2840-433c-b7c9-c0df023f6e43. 
{{(pid=70374) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Jul 27 09:32:30 user nova-compute[70374]: DEBUG nova.network.neutron [req-d9437a69-5202-4cbd-913d-b9f1b2ae4e6d req-ddeaf04c-d1bf-4480-8146-bfe0d9381459 service nova] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Updating instance_info_cache with network_info: [{"id": "cc9444e6-2840-433c-b7c9-c0df023f6e43", "address": "fa:16:3e:bf:c6:71", "network": {"id": "f60fad83-a431-45cc-8f33-49bbd1e52e4c", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-2025296099-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "aa3b4d25fdc94c4ea2e148e16478b40c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc9444e6-28", "ovs_interfaceid": "cc9444e6-2840-433c-b7c9-c0df023f6e43", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:32:30 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-d9437a69-5202-4cbd-913d-b9f1b2ae4e6d req-ddeaf04c-d1bf-4480-8146-bfe0d9381459 service nova] Releasing lock "refresh_cache-42f4c546-47e4-485b-be29-4081c7557bad" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Jul 27 09:32:32 user nova-compute[70374]: DEBUG nova.virt.driver [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Emitting event Resumed> {{(pid=70374) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Jul 27 09:32:32 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] VM Resumed (Lifecycle Event) Jul 27 09:32:32 user nova-compute[70374]: DEBUG nova.compute.manager [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Instance event wait completed in 0 seconds for {{(pid=70374) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Jul 27 09:32:32 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Guest created on hypervisor {{(pid=70374) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4411}} Jul 27 09:32:32 user nova-compute[70374]: INFO nova.virt.libvirt.driver [-] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Instance spawned successfully. 
Jul 27 09:32:32 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:894}} Jul 27 09:32:32 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:32:32 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70374) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Jul 27 09:32:32 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Found default for hw_cdrom_bus of ide {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:32:32 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Found default for hw_disk_bus of virtio {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:32:32 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Found default for hw_input_bus of None {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:32:32 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Found default for hw_pointer_model of None {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:32:32 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Found default for hw_video_model of virtio {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:32:32 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 
2e225354-053f-48f5-ad3f-6662b627d531] Found default for hw_vif_model of virtio {{(pid=70374) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:923}} Jul 27 09:32:32 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] During sync_power_state the instance has a pending task (spawning). Skip. Jul 27 09:32:32 user nova-compute[70374]: DEBUG nova.virt.driver [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Emitting event Started> {{(pid=70374) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Jul 27 09:32:32 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] VM Started (Lifecycle Event) Jul 27 09:32:32 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:32:32 user nova-compute[70374]: DEBUG nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=70374) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Jul 27 09:32:32 user nova-compute[70374]: INFO nova.compute.manager [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] During sync_power_state the instance has a pending task (spawning). Skip. Jul 27 09:32:32 user nova-compute[70374]: INFO nova.compute.manager [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Took 6.19 seconds to spawn the instance on the hypervisor. Jul 27 09:32:32 user nova-compute[70374]: DEBUG nova.compute.manager [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:32:32 user nova-compute[70374]: INFO nova.compute.manager [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Took 7.01 seconds to build instance. 
Jul 27 09:32:32 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-3926d077-4dad-4892-b624-90c9fb8cf216 tempest-AttachVolumeNegativeTest-305502590 tempest-AttachVolumeNegativeTest-305502590-project-member] Lock "2e225354-053f-48f5-ad3f-6662b627d531" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.097s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:32 user nova-compute[70374]: DEBUG nova.compute.manager [req-476e16c0-dd92-482e-8033-83da6794fadf req-6fd8b993-f863-43b2-ba68-588341b0c30d service nova] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Received event network-vif-plugged-ddf41035-1805-4fde-b4cf-d86ab3e59c3b {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:32:32 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-476e16c0-dd92-482e-8033-83da6794fadf req-6fd8b993-f863-43b2-ba68-588341b0c30d service nova] Acquiring lock "2e225354-053f-48f5-ad3f-6662b627d531-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:32 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-476e16c0-dd92-482e-8033-83da6794fadf req-6fd8b993-f863-43b2-ba68-588341b0c30d service nova] Lock "2e225354-053f-48f5-ad3f-6662b627d531-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:32 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-476e16c0-dd92-482e-8033-83da6794fadf req-6fd8b993-f863-43b2-ba68-588341b0c30d service nova] Lock "2e225354-053f-48f5-ad3f-6662b627d531-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:32 user nova-compute[70374]: DEBUG nova.compute.manager [req-476e16c0-dd92-482e-8033-83da6794fadf req-6fd8b993-f863-43b2-ba68-588341b0c30d service nova] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] No waiting events found dispatching network-vif-plugged-ddf41035-1805-4fde-b4cf-d86ab3e59c3b {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:32:32 user nova-compute[70374]: WARNING nova.compute.manager [req-476e16c0-dd92-482e-8033-83da6794fadf req-6fd8b993-f863-43b2-ba68-588341b0c30d service nova] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Received unexpected event network-vif-plugged-ddf41035-1805-4fde-b4cf-d86ab3e59c3b for instance with vm_state active and task_state None. 
Jul 27 09:32:32 user nova-compute[70374]: DEBUG nova.compute.manager [req-476e16c0-dd92-482e-8033-83da6794fadf req-6fd8b993-f863-43b2-ba68-588341b0c30d service nova] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Received event network-vif-plugged-ddf41035-1805-4fde-b4cf-d86ab3e59c3b {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:32:32 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-476e16c0-dd92-482e-8033-83da6794fadf req-6fd8b993-f863-43b2-ba68-588341b0c30d service nova] Acquiring lock "2e225354-053f-48f5-ad3f-6662b627d531-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:32 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-476e16c0-dd92-482e-8033-83da6794fadf req-6fd8b993-f863-43b2-ba68-588341b0c30d service nova] Lock "2e225354-053f-48f5-ad3f-6662b627d531-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:32 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-476e16c0-dd92-482e-8033-83da6794fadf req-6fd8b993-f863-43b2-ba68-588341b0c30d service nova] Lock "2e225354-053f-48f5-ad3f-6662b627d531-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:32 user nova-compute[70374]: DEBUG nova.compute.manager [req-476e16c0-dd92-482e-8033-83da6794fadf req-6fd8b993-f863-43b2-ba68-588341b0c30d service nova] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] No waiting events found dispatching network-vif-plugged-ddf41035-1805-4fde-b4cf-d86ab3e59c3b {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:32:32 user nova-compute[70374]: WARNING nova.compute.manager [req-476e16c0-dd92-482e-8033-83da6794fadf req-6fd8b993-f863-43b2-ba68-588341b0c30d service nova] [instance: 2e225354-053f-48f5-ad3f-6662b627d531] Received unexpected event network-vif-plugged-ddf41035-1805-4fde-b4cf-d86ab3e59c3b for instance with vm_state active and task_state None. 
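Both deliveries of network-vif-plugged above end in a WARNING because the spawn had already finished: nothing was registered as waiting on the event, so pop_instance_event finds no waiter and the notification is treated as unexpected. A rough sketch of that register/pop pattern, assuming hypothetical class internals (only the method names pop_instance_event/external_instance_event and the log wording are taken from the records above, the rest is illustrative):

    # Illustrative waiter/pop pattern behind "No waiting events found dispatching ..."
    # and "Received unexpected event ...". Internals are hypothetical, not Nova's code.
    import threading

    class InstanceEvents:
        def __init__(self):
            self._lock = threading.Lock()   # stands in for the "<uuid>-events" lock
            self._waiters = {}              # (instance_uuid, event_name) -> threading.Event

        def prepare_for_event(self, instance_uuid, event_name):
            # The spawn path registers a waiter before plugging the VIF, so a later
            # network-vif-plugged notification has something to wake up.
            with self._lock:
                waiter = threading.Event()
                self._waiters[(instance_uuid, event_name)] = waiter
                return waiter

        def pop_instance_event(self, instance_uuid, event_name):
            with self._lock:
                return self._waiters.pop((instance_uuid, event_name), None)

    def external_instance_event(events, instance_uuid, event_name, vm_state, task_state):
        waiter = events.pop_instance_event(instance_uuid, event_name)
        if waiter is None:
            # Nobody was waiting: the instance is already active, so the late
            # notification is only logged as unexpected.
            print(f"unexpected {event_name} for {instance_uuid} "
                  f"(vm_state={vm_state}, task_state={task_state})")
        else:
            waiter.set()  # unblock whoever was waiting for plug confirmation

    # Reproduces the case in the log: no waiter registered, instance active.
    events = InstanceEvents()
    external_instance_event(events, "2e225354-053f-48f5-ad3f-6662b627d531",
                            "network-vif-plugged", "active", None)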
Jul 27 09:32:33 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Acquiring lock "42f4c546-47e4-485b-be29-4081c7557bad" by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:33 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Lock "42f4c546-47e4-485b-be29-4081c7557bad" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:33 user nova-compute[70374]: DEBUG nova.objects.instance [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Lazy-loading 'flavor' on Instance uuid 42f4c546-47e4-485b-be29-4081c7557bad {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:32:33 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Lock "42f4c546-47e4-485b-be29-4081c7557bad" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" :: held 0.057s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:33 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Acquiring lock "42f4c546-47e4-485b-be29-4081c7557bad" by "nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:33 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Lock "42f4c546-47e4-485b-be29-4081c7557bad" acquired by "nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:33 user nova-compute[70374]: INFO nova.compute.manager [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Attaching volume 6ae2fa33-45a0-4d20-bbaf-a51427718f3c to /dev/vdb Jul 27 09:32:33 user nova-compute[70374]: DEBUG os_brick.utils [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '10.0.0.210', 'multipath': False, 'enforce_multipath': True, 'host': 'user', 'execute': None}" {{(pid=70374) trace_logging_wrapper /usr/local/lib/python3.10/dist-packages/os_brick/utils.py:175}} Jul 27 09:32:33 user 
nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:33 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:33 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.012s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:33 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[2728b0e5-761f-47ac-95b0-a2f311ed04ec]: (4, ('InitiatorName=iqn.2016-04.com.open-iscsi:ce3dd372cc44\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:33 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt / -n -o SOURCE {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:33 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "findmnt / -n -o SOURCE" returned: 0 in 0.023s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:33 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[6ec6e18d-474e-42d0-a59a-f370ccdd1bef]: (4, ('/dev/mapper/vg0-lv--0\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:33 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): blkid /dev/mapper/vg0-lv--0 -s UUID -o value {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:33 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "blkid /dev/mapper/vg0-lv--0 -s UUID -o value" returned: 0 in 0.015s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:33 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[1c857977-3191-45ad-9635-8238b7ec44ef]: (4, ('e95b3b51-542d-42ca-ac40-f83360608668\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:33 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[94b861c6-6c60-463b-9014-d4a5f026aa6c]: (4, 'e20c3142-5af9-7467-ecd8-70b2e4a210d6') {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:33 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Running cmd (subprocess): nvme version {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:33 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] 'nvme version' failed. Not Retrying. 
{{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:473}} Jul 27 09:32:33 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.nvmeof [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] nvme not present on system {{(pid=70374) nvme_present /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/nvmeof.py:757}} Jul 27 09:32:33 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): nvme show-hostnqn {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:33 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] 'nvme show-hostnqn' failed. Not Retrying. {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:473}} Jul 27 09:32:33 user nova-compute[70374]: WARNING os_brick.privileged.nvmeof [-] Could not generate host nqn: [Errno 2] No such file or directory: 'nvme' Jul 27 09:32:33 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[6d054211-02b8-48c9-b2e2-a2a11ce75860]: (4, '') {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:33 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.lightos [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] LIGHTOS: [Errno 111] ECONNREFUSED {{(pid=70374) find_dsc /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/lightos.py:98}} Jul 27 09:32:33 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.lightos [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] LIGHTOS: did not find dsc, continuing anyway. {{(pid=70374) get_connector_properties /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/lightos.py:76}} Jul 27 09:32:33 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.lightos [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] LIGHTOS: no hostnqn found. 
{{(pid=70374) get_connector_properties /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/lightos.py:84}} Jul 27 09:32:33 user nova-compute[70374]: DEBUG os_brick.utils [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] <== get_connector_properties: return (119ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '10.0.0.210', 'host': 'user', 'multipath': False, 'initiator': 'iqn.2016-04.com.open-iscsi:ce3dd372cc44', 'do_local_attach': False, 'uuid': 'e95b3b51-542d-42ca-ac40-f83360608668', 'system uuid': 'e20c3142-5af9-7467-ecd8-70b2e4a210d6', 'nvme_native_multipath': False} {{(pid=70374) trace_logging_wrapper /usr/local/lib/python3.10/dist-packages/os_brick/utils.py:202}} Jul 27 09:32:33 user nova-compute[70374]: DEBUG nova.virt.block_device [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Updating existing volume attachment record: 7fccd7db-a1e1-42d0-acd3-c082af2b8fe9 {{(pid=70374) _volume_attach /opt/stack/nova/nova/virt/block_device.py:631}} Jul 27 09:32:34 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:35 user nova-compute[70374]: DEBUG nova.compute.manager [req-06400f19-d10c-49be-8a4b-931225694726 req-e2efc802-427b-4daa-96c6-4e1240e0af5e service nova] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Received event network-changed-626b287e-22ee-49d7-8ec3-c1a0660bd5d8 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:32:35 user nova-compute[70374]: DEBUG nova.compute.manager [req-06400f19-d10c-49be-8a4b-931225694726 req-e2efc802-427b-4daa-96c6-4e1240e0af5e service nova] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Refreshing instance network info cache due to event network-changed-626b287e-22ee-49d7-8ec3-c1a0660bd5d8. {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Jul 27 09:32:35 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-06400f19-d10c-49be-8a4b-931225694726 req-e2efc802-427b-4daa-96c6-4e1240e0af5e service nova] Acquiring lock "refresh_cache-309c9c26-4a0f-45db-bb3a-595b19f3f627" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Jul 27 09:32:35 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-06400f19-d10c-49be-8a4b-931225694726 req-e2efc802-427b-4daa-96c6-4e1240e0af5e service nova] Acquired lock "refresh_cache-309c9c26-4a0f-45db-bb3a-595b19f3f627" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Jul 27 09:32:35 user nova-compute[70374]: DEBUG nova.network.neutron [req-06400f19-d10c-49be-8a4b-931225694726 req-e2efc802-427b-4daa-96c6-4e1240e0af5e service nova] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Refreshing network info cache for port 626b287e-22ee-49d7-8ec3-c1a0660bd5d8 {{(pid=70374) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Jul 27 09:32:35 user nova-compute[70374]: DEBUG nova.network.neutron [req-06400f19-d10c-49be-8a4b-931225694726 req-e2efc802-427b-4daa-96c6-4e1240e0af5e service nova] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Updated VIF entry in instance network info cache for port 626b287e-22ee-49d7-8ec3-c1a0660bd5d8. 
{{(pid=70374) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Jul 27 09:32:35 user nova-compute[70374]: DEBUG nova.network.neutron [req-06400f19-d10c-49be-8a4b-931225694726 req-e2efc802-427b-4daa-96c6-4e1240e0af5e service nova] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Updating instance_info_cache with network_info: [{"id": "626b287e-22ee-49d7-8ec3-c1a0660bd5d8", "address": "fa:16:3e:57:6c:c6", "network": {"id": "f6f1459d-22db-4bf7-8b7b-d6dec4fa18d5", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-2084493807-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.49", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cdd638e5400740279443a374e3e570d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap626b287e-22", "ovs_interfaceid": "626b287e-22ee-49d7-8ec3-c1a0660bd5d8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:32:35 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-06400f19-d10c-49be-8a4b-931225694726 req-e2efc802-427b-4daa-96c6-4e1240e0af5e service nova] Releasing lock "refresh_cache-309c9c26-4a0f-45db-bb3a-595b19f3f627" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Jul 27 09:32:36 user nova-compute[70374]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70374) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Jul 27 09:32:36 user nova-compute[70374]: INFO nova.compute.manager [-] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] VM Stopped (Lifecycle Event) Jul 27 09:32:36 user nova-compute[70374]: DEBUG nova.compute.manager [None req-e4e4fa06-b0f4-4e37-bbbe-9205f6e2bbd0 None None] [instance: 6a5593cd-3ab6-4859-b0fb-33a0ed702dc8] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:32:37 user nova-compute[70374]: DEBUG nova.compute.manager [req-3ee3fd74-0deb-4423-8ba6-20a1d9399c09 req-2c632ff0-78d8-43d2-a908-03ccad4432c9 service nova] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Received event network-changed-018c2bfb-405c-4bba-a54f-e8ea773e57bf {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:32:37 user nova-compute[70374]: DEBUG nova.compute.manager [req-3ee3fd74-0deb-4423-8ba6-20a1d9399c09 req-2c632ff0-78d8-43d2-a908-03ccad4432c9 service nova] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Refreshing instance network info cache due to event network-changed-018c2bfb-405c-4bba-a54f-e8ea773e57bf. 
{{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Jul 27 09:32:37 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-3ee3fd74-0deb-4423-8ba6-20a1d9399c09 req-2c632ff0-78d8-43d2-a908-03ccad4432c9 service nova] Acquiring lock "refresh_cache-d35fe056-8279-479a-a673-6c61e5ec6933" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Jul 27 09:32:37 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-3ee3fd74-0deb-4423-8ba6-20a1d9399c09 req-2c632ff0-78d8-43d2-a908-03ccad4432c9 service nova] Acquired lock "refresh_cache-d35fe056-8279-479a-a673-6c61e5ec6933" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Jul 27 09:32:37 user nova-compute[70374]: DEBUG nova.network.neutron [req-3ee3fd74-0deb-4423-8ba6-20a1d9399c09 req-2c632ff0-78d8-43d2-a908-03ccad4432c9 service nova] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Refreshing network info cache for port 018c2bfb-405c-4bba-a54f-e8ea773e57bf {{(pid=70374) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Jul 27 09:32:38 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Acquiring lock "d35fe056-8279-479a-a673-6c61e5ec6933" by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:38 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Lock "d35fe056-8279-479a-a673-6c61e5ec6933" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:38 user nova-compute[70374]: DEBUG nova.objects.instance [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Lazy-loading 'flavor' on Instance uuid d35fe056-8279-479a-a673-6c61e5ec6933 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:32:38 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Lock "d35fe056-8279-479a-a673-6c61e5ec6933" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" :: held 0.074s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:38 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:38 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Acquiring lock "d35fe056-8279-479a-a673-6c61e5ec6933" by "nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:38 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-6d09fde6-af06-4e12-8545-49a102d0804e 
tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Lock "d35fe056-8279-479a-a673-6c61e5ec6933" acquired by "nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:38 user nova-compute[70374]: INFO nova.compute.manager [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Attaching volume 25fa94b4-c6dc-4dbd-add9-e3a58523baac to /dev/vdb Jul 27 09:32:38 user nova-compute[70374]: DEBUG nova.network.neutron [req-3ee3fd74-0deb-4423-8ba6-20a1d9399c09 req-2c632ff0-78d8-43d2-a908-03ccad4432c9 service nova] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Updated VIF entry in instance network info cache for port 018c2bfb-405c-4bba-a54f-e8ea773e57bf. {{(pid=70374) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Jul 27 09:32:38 user nova-compute[70374]: DEBUG nova.network.neutron [req-3ee3fd74-0deb-4423-8ba6-20a1d9399c09 req-2c632ff0-78d8-43d2-a908-03ccad4432c9 service nova] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Updating instance_info_cache with network_info: [{"id": "018c2bfb-405c-4bba-a54f-e8ea773e57bf", "address": "fa:16:3e:2d:07:78", "network": {"id": "aead18f5-45fa-4ce5-9c4a-086f9f9d07e6", "bridge": "br-int", "label": "tempest-TestVolumeSwap-383523410-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.135", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "2010f7a57b654bee8736f3ed8d805b2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap018c2bfb-40", "ovs_interfaceid": "018c2bfb-405c-4bba-a54f-e8ea773e57bf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:32:38 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-3ee3fd74-0deb-4423-8ba6-20a1d9399c09 req-2c632ff0-78d8-43d2-a908-03ccad4432c9 service nova] Releasing lock "refresh_cache-d35fe056-8279-479a-a673-6c61e5ec6933" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Jul 27 09:32:38 user nova-compute[70374]: DEBUG os_brick.utils [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '10.0.0.210', 'multipath': False, 'enforce_multipath': True, 'host': 'user', 'execute': None}" {{(pid=70374) trace_logging_wrapper /usr/local/lib/python3.10/dist-packages/os_brick/utils.py:175}} Jul 27 09:32:38 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:38 user 
nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.013s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:38 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[f7d8883f-632c-493f-8263-735853dbdf72]: (4, ('InitiatorName=iqn.2016-04.com.open-iscsi:ce3dd372cc44\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:38 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt / -n -o SOURCE {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:38 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "findmnt / -n -o SOURCE" returned: 0 in 0.017s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:38 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[fa1ac474-c2e1-45d8-a5fe-ff5f4e5b8c81]: (4, ('/dev/mapper/vg0-lv--0\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:38 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): blkid /dev/mapper/vg0-lv--0 -s UUID -o value {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:38 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "blkid /dev/mapper/vg0-lv--0 -s UUID -o value" returned: 0 in 0.017s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:38 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[7f184057-10d2-486a-bb3f-f95ea4427791]: (4, ('e95b3b51-542d-42ca-ac40-f83360608668\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:38 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[fd10740e-9b6f-4e45-ad3f-4bbaf0c5fc1b]: (4, 'e20c3142-5af9-7467-ecd8-70b2e4a210d6') {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:38 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Running cmd (subprocess): nvme version {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:38 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] 'nvme version' failed. Not Retrying. 
{{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:473}} Jul 27 09:32:38 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.nvmeof [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] nvme not present on system {{(pid=70374) nvme_present /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/nvmeof.py:757}} Jul 27 09:32:38 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): nvme show-hostnqn {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:38 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] 'nvme show-hostnqn' failed. Not Retrying. {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:473}} Jul 27 09:32:38 user nova-compute[70374]: WARNING os_brick.privileged.nvmeof [-] Could not generate host nqn: [Errno 2] No such file or directory: 'nvme' Jul 27 09:32:38 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[5b213c1c-02eb-4b65-8c48-092a6b737d0e]: (4, '') {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:38 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.lightos [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] LIGHTOS: [Errno 111] ECONNREFUSED {{(pid=70374) find_dsc /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/lightos.py:98}} Jul 27 09:32:38 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.lightos [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] LIGHTOS: did not find dsc, continuing anyway. {{(pid=70374) get_connector_properties /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/lightos.py:76}} Jul 27 09:32:38 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.lightos [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] LIGHTOS: no hostnqn found. 
{{(pid=70374) get_connector_properties /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/lightos.py:84}} Jul 27 09:32:38 user nova-compute[70374]: DEBUG os_brick.utils [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] <== get_connector_properties: return (114ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '10.0.0.210', 'host': 'user', 'multipath': False, 'initiator': 'iqn.2016-04.com.open-iscsi:ce3dd372cc44', 'do_local_attach': False, 'uuid': 'e95b3b51-542d-42ca-ac40-f83360608668', 'system uuid': 'e20c3142-5af9-7467-ecd8-70b2e4a210d6', 'nvme_native_multipath': False} {{(pid=70374) trace_logging_wrapper /usr/local/lib/python3.10/dist-packages/os_brick/utils.py:202}} Jul 27 09:32:38 user nova-compute[70374]: DEBUG nova.virt.block_device [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Updating existing volume attachment record: 13abfe22-c4d5-47a4-943f-3c4a9aa930de {{(pid=70374) _volume_attach /opt/stack/nova/nova/virt/block_device.py:631}} Jul 27 09:32:39 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:39 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Acquiring lock "309c9c26-4a0f-45db-bb3a-595b19f3f627" by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:39 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Lock "309c9c26-4a0f-45db-bb3a-595b19f3f627" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:39 user nova-compute[70374]: DEBUG nova.objects.instance [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Lazy-loading 'flavor' on Instance uuid 309c9c26-4a0f-45db-bb3a-595b19f3f627 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:32:39 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Lock "309c9c26-4a0f-45db-bb3a-595b19f3f627" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" :: held 0.057s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:39 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Acquiring lock "309c9c26-4a0f-45db-bb3a-595b19f3f627" by "nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" {{(pid=70374) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:39 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Lock "309c9c26-4a0f-45db-bb3a-595b19f3f627" acquired by "nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:39 user nova-compute[70374]: INFO nova.compute.manager [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Attaching volume 0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e to /dev/sdc Jul 27 09:32:39 user nova-compute[70374]: DEBUG os_brick.utils [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '10.0.0.210', 'multipath': False, 'enforce_multipath': True, 'host': 'user', 'execute': None}" {{(pid=70374) trace_logging_wrapper /usr/local/lib/python3.10/dist-packages/os_brick/utils.py:175}} Jul 27 09:32:39 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:39 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.014s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:39 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[6fd818fd-b708-47b1-b95b-031bfdb665a4]: (4, ('InitiatorName=iqn.2016-04.com.open-iscsi:ce3dd372cc44\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:39 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt / -n -o SOURCE {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:39 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "findmnt / -n -o SOURCE" returned: 0 in 0.018s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:39 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[b1f11d17-d521-4415-9de0-929e3e0659d3]: (4, ('/dev/mapper/vg0-lv--0\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:39 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): blkid /dev/mapper/vg0-lv--0 -s UUID -o value {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:39 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "blkid /dev/mapper/vg0-lv--0 -s UUID -o value" returned: 0 in 0.014s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:39 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[86d7f6bf-a235-498a-9b15-76ac98863e3f]: (4, 
('e95b3b51-542d-42ca-ac40-f83360608668\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:39 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[85174f4a-dff3-40a1-a1b7-f942d5a4ae19]: (4, 'e20c3142-5af9-7467-ecd8-70b2e4a210d6') {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:39 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Running cmd (subprocess): nvme version {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:39 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] 'nvme version' failed. Not Retrying. {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:473}} Jul 27 09:32:39 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.nvmeof [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] nvme not present on system {{(pid=70374) nvme_present /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/nvmeof.py:757}} Jul 27 09:32:39 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): nvme show-hostnqn {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:39 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] 'nvme show-hostnqn' failed. Not Retrying. {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:473}} Jul 27 09:32:39 user nova-compute[70374]: WARNING os_brick.privileged.nvmeof [-] Could not generate host nqn: [Errno 2] No such file or directory: 'nvme' Jul 27 09:32:39 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[bf0603a4-b0a9-4df7-8b5f-3600ccd536f0]: (4, '') {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:39 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.lightos [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] LIGHTOS: [Errno 111] ECONNREFUSED {{(pid=70374) find_dsc /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/lightos.py:98}} Jul 27 09:32:39 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.lightos [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] LIGHTOS: did not find dsc, continuing anyway. {{(pid=70374) get_connector_properties /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/lightos.py:76}} Jul 27 09:32:39 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.lightos [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] LIGHTOS: no hostnqn found. 
{{(pid=70374) get_connector_properties /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/lightos.py:84}} Jul 27 09:32:39 user nova-compute[70374]: DEBUG os_brick.utils [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] <== get_connector_properties: return (204ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '10.0.0.210', 'host': 'user', 'multipath': False, 'initiator': 'iqn.2016-04.com.open-iscsi:ce3dd372cc44', 'do_local_attach': False, 'uuid': 'e95b3b51-542d-42ca-ac40-f83360608668', 'system uuid': 'e20c3142-5af9-7467-ecd8-70b2e4a210d6', 'nvme_native_multipath': False} {{(pid=70374) trace_logging_wrapper /usr/local/lib/python3.10/dist-packages/os_brick/utils.py:202}} Jul 27 09:32:39 user nova-compute[70374]: DEBUG nova.virt.block_device [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Updating existing volume attachment record: 4e610672-d79f-4134-b5c6-ecf3cea4031d {{(pid=70374) _volume_attach /opt/stack/nova/nova/virt/block_device.py:631}} Jul 27 09:32:40 user nova-compute[70374]: DEBUG nova.virt.libvirt.volume.iscsi [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Calling os-brick to attach iSCSI Volume {{(pid=70374) connect_volume /opt/stack/nova/nova/virt/libvirt/volume/iscsi.py:63}} Jul 27 09:32:40 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] ==> connect_volume: call "{'self': , 'connection_properties': {'target_iqn': 'iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c', 'target_portal': '172.16.0.220:3260', 'target_discovered': False, 'auth_method': 'CHAP', 'auth_username': 'UdaFcWLo', 'auth_password': '***', 'target_lun': 0, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False}}" {{(pid=70374) trace_logging_wrapper /usr/local/lib/python3.10/dist-packages/os_brick/utils.py:175}} Jul 27 09:32:40 user nova-compute[70374]: WARNING os_brick.initiator.connectors.base [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Service needs to call os_brick.setup() before connecting volumes, if it doesn't it will break on the next release Jul 27 09:32:40 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Acquiring lock "connect_volume" by "os_brick.initiator.connectors.iscsi.ISCSIConnector.connect_volume" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:68}} Jul 27 09:32:40 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Lock "connect_volume" acquired by "os_brick.initiator.connectors.iscsi.ISCSIConnector.connect_volume" :: waited 0.002s {{(pid=70374) inner 
/usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:73}} Jul 27 09:32:40 user nova-compute[70374]: WARNING os_brick.initiator.connectors.base [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Service needs to call os_brick.setup() before connecting volumes, if it doesn't it will break on the next release Jul 27 09:32:40 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Acquiring lock "connect_to_iscsi_portal-172.16.0.220:3260-iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c" by "os_brick.initiator.connectors.iscsi.ISCSIConnector._connect_to_iscsi_portal_unsafe" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:68}} Jul 27 09:32:40 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Lock "connect_to_iscsi_portal-172.16.0.220:3260-iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c" acquired by "os_brick.initiator.connectors.iscsi.ISCSIConnector._connect_to_iscsi_portal_unsafe" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:73}} Jul 27 09:32:40 user nova-compute[70374]: INFO os_brick.initiator.connectors.iscsi [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Trying to connect to iSCSI portal 172.16.0.220:3260 Jul 27 09:32:40 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c -p 172.16.0.220:3260 {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:40 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c -p 172.16.0.220:3260" returned: 21 in 0.020s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:40 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[fef85fe4-4b2a-42df-8f39-6b6cb79c8d2c]: (4, ('', 'iscsiadm: No records found\n')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:40 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] iscsiadm (): stdout= stderr=iscsiadm: No records found Jul 27 09:32:40 user nova-compute[70374]: {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:40 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c -p 172.16.0.220:3260 --interface default --op new {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:41 user nova-compute[70374]: 
DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c -p 172.16.0.220:3260 --interface default --op new" returned: 0 in 0.021s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[6fecb278-0cc7-487b-88af-16d85d16e3a1]: (4, ('New iSCSI node [tcp:[hw=,ip=,net_if=,iscsi_if=default] 172.16.0.220,3260,-1 iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c] added\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] iscsiadm ('--interface', 'default', '--op', 'new'): stdout=New iSCSI node [tcp:[hw=,ip=,net_if=,iscsi_if=default] 172.16.0.220,3260,-1 iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c] added Jul 27 09:32:41 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c -p 172.16.0.220:3260 --op update -n node.session.scan -v manual {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c -p 172.16.0.220:3260 --op update -n node.session.scan -v manual" returned: 0 in 0.021s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[d15cb9e7-8b5e-4061-a9b3-9f4492356036]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] iscsiadm ('--op', 'update', '-n', 'node.session.scan', '-v', 'manual'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c -p 172.16.0.220:3260 --op update -n node.session.auth.authmethod -v CHAP {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c -p 172.16.0.220:3260 --op update -n node.session.auth.authmethod -v CHAP" returned: 0 in 0.026s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[9a0cdcea-2c24-4ff9-98c5-8241237e1b06]: (4, ('', '')) 
{{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] iscsiadm ('--op', 'update', '-n', 'node.session.auth.authmethod', '-v', 'CHAP'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c -p 172.16.0.220:3260 --op update -n node.session.auth.username -v UdaFcWLo {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c -p 172.16.0.220:3260 --op update -n node.session.auth.username -v UdaFcWLo" returned: 0 in 0.023s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[403f2747-ca7c-470f-bee5-d7fea1178b18]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] iscsiadm ('--op', 'update', '-n', 'node.session.auth.username', '-v', 'UdaFcWLo'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c -p 172.16.0.220:3260 --op update -n node.session.auth.password -v *** {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c -p 172.16.0.220:3260 --op update -n node.session.auth.password -v ***" returned: 0 in 0.021s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[61546b20-ffcf-47c5-974a-aea3c6947afb]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] iscsiadm ('--op', 'update', '-n', 'node.session.auth.password', '-v', '***'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m session {{(pid=82164) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m session" returned: 0 in 0.021s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[e48076d4-1027-41f3-bb7c-25360f3fe09f]: (4, ('tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash)\ntcp: [4] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash)\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] iscsiadm ('-m', 'session'): stdout=tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash) Jul 27 09:32:41 user nova-compute[70374]: tcp: [4] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash) Jul 27 09:32:41 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm_bare /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1182}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] iscsi session list stdout=tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash) Jul 27 09:32:41 user nova-compute[70374]: tcp: [4] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash) Jul 27 09:32:41 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsi_session /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1171}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c -p 172.16.0.220:3260 --login {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c -p 172.16.0.220:3260 --login" returned: 0 in 0.057s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[d2497cca-8c17-4eba-bf47-c74743beda60]: (4, ('Logging in to [iface: default, target: iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c, portal: 172.16.0.220,3260]\nLogin to [iface: default, target: iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c, portal: 172.16.0.220,3260] successful.\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] iscsiadm ('--login',): stdout=Logging in to 
[iface: default, target: iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c, portal: 172.16.0.220,3260] Jul 27 09:32:41 user nova-compute[70374]: Login to [iface: default, target: iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c, portal: 172.16.0.220,3260] successful. Jul 27 09:32:41 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c -p 172.16.0.220:3260 --op update -n node.startup -v automatic {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c -p 172.16.0.220:3260 --op update -n node.startup -v automatic" returned: 0 in 0.018s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[6ab4b54b-df69-441d-8818-3f9c8c828c49]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] iscsiadm ('--op', 'update', '-n', 'node.startup', '-v', 'automatic'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m session {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m session" returned: 0 in 0.021s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[f2c20821-5cdd-4481-b408-670282adba93]: (4, ('tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash)\ntcp: [4] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash)\ntcp: [5] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c (non-flash)\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] iscsiadm ('-m', 'session'): stdout=tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash) Jul 27 09:32:41 user nova-compute[70374]: tcp: [4] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash) Jul 27 09:32:41 user nova-compute[70374]: tcp: [5] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c (non-flash) Jul 27 09:32:41 user nova-compute[70374]: 
stderr= {{(pid=70374) _run_iscsiadm_bare /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1182}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] iscsi session list stdout=tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash) Jul 27 09:32:41 user nova-compute[70374]: tcp: [4] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash) Jul 27 09:32:41 user nova-compute[70374]: tcp: [5] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c (non-flash) Jul 27 09:32:41 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsi_session /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1171}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Lock "connect_to_iscsi_portal-172.16.0.220:3260-iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c" "released" by "os_brick.initiator.connectors.iscsi.ISCSIConnector._connect_to_iscsi_portal_unsafe" :: held 0.293s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:87}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Connected to 172.16.0.220:3260 {{(pid=70374) _connect_vol /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:633}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] HCTL ('35', '-', '-', 0) found on session 5 with lun 0 {{(pid=70374) get_hctl /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:694}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Scanning host 35 c: -, t: -, l: 0) {{(pid=70374) scan_iscsi /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:719}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): tee -a /sys/class/scsi_host/host35/scan {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "tee -a /sys/class/scsi_host/host35/scan" returned: 0 in 0.017s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[cfdd57f4-2632-4c10-a5d4-32c931a874f5]: (4, ('- - 0', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 
tempest-VolumesAdminNegativeTest-1398001736-project-member] Searching for a device in session 5 and hctl ['35', '*', '*', 0] yield: sdd {{(pid=70374) device_name_by_hctl /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:713}} Jul 27 09:32:41 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Connected to sdd using {'target_iqn': 'iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c', 'target_portal': '172.16.0.220:3260', 'target_discovered': False, 'auth_method': 'CHAP', 'auth_username': 'UdaFcWLo', 'auth_password': '***', 'target_lun': 0, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False} {{(pid=70374) _connect_vol /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:661}} Jul 27 09:32:42 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Lock "connect_volume" "released" by "os_brick.initiator.connectors.iscsi.ISCSIConnector.connect_volume" :: held 1.331s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:87}} Jul 27 09:32:42 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] <== connect_volume: return (1334ms) {'type': 'block', 'scsi_wwn': '23538393662353264', 'path': '/dev/sdd'} {{(pid=70374) trace_logging_wrapper /usr/local/lib/python3.10/dist-packages/os_brick/utils.py:202}} Jul 27 09:32:42 user nova-compute[70374]: DEBUG nova.virt.libvirt.volume.iscsi [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Attached iSCSI volume {'type': 'block', 'scsi_wwn': '23538393662353264', 'path': '/dev/sdd'} {{(pid=70374) connect_volume /opt/stack/nova/nova/virt/libvirt/volume/iscsi.py:65}} Jul 27 09:32:42 user nova-compute[70374]: DEBUG nova.objects.instance [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Lazy-loading 'flavor' on Instance uuid 42f4c546-47e4-485b-be29-4081c7557bad {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:32:42 user nova-compute[70374]: DEBUG nova.virt.libvirt.guest [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] attach device xml: Jul 27 09:32:42 user nova-compute[70374]: Jul 27 09:32:42 user nova-compute[70374]: Jul 27 09:32:42 user nova-compute[70374]: Jul 27 09:32:42 user nova-compute[70374]: 6ae2fa33-45a0-4d20-bbaf-a51427718f3c Jul 27 09:32:42 user nova-compute[70374]: Jul 27 09:32:42 user nova-compute[70374]: {{(pid=70374) attach_device /opt/stack/nova/nova/virt/libvirt/guest.py:339}} Jul 27 09:32:42 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] No BDM found with device name vda, not building metadata. 
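The attach trace above is driven by os-brick through the privsep daemon, but the underlying iscsiadm sequence can be reproduced by hand. Below is a minimal sketch, assuming root privileges and reusing the portal, IQN and CHAP username printed in the log (the password is masked there, so a placeholder is used); the iscsiadm() helper is hypothetical and omits the locking, retries and error handling that os-brick adds.

# Minimal sketch of the iscsiadm sequence logged above (assumed values; run as root).
import subprocess

PORTAL = "172.16.0.220:3260"   # target_portal from the connection properties
IQN = "iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c"  # target_iqn
CHAP_USER = "UdaFcWLo"         # auth_username from the log
CHAP_PASS = "***"              # masked in the log; placeholder only

def iscsiadm(*args):
    """Run iscsiadm -m node against the target/portal, as os-brick's _run_iscsiadm does."""
    cmd = ["iscsiadm", "-m", "node", "-T", IQN, "-p", PORTAL, *args]
    return subprocess.run(cmd, capture_output=True, text=True, check=True)

# 1. Create the node record and disable automatic session scans.
iscsiadm("--interface", "default", "--op", "new")
iscsiadm("--op", "update", "-n", "node.session.scan", "-v", "manual")

# 2. Configure CHAP authentication.
iscsiadm("--op", "update", "-n", "node.session.auth.authmethod", "-v", "CHAP")
iscsiadm("--op", "update", "-n", "node.session.auth.username", "-v", CHAP_USER)
iscsiadm("--op", "update", "-n", "node.session.auth.password", "-v", CHAP_PASS)

# 3. Log in and mark the node record for automatic startup.
iscsiadm("--login")
iscsiadm("--op", "update", "-n", "node.startup", "-v", "automatic")

# 4. Trigger a manual SCSI scan ("- - 0" = any channel, any target, LUN 0),
#    the equivalent of the "tee -a /sys/class/scsi_host/host35/scan" call above.
#    The host number (35 here) comes from the HCTL of the new session.
with open("/sys/class/scsi_host/host35/scan", "w") as f:
    f.write("- - 0")

After the scan, the new block device (sdd in this run) shows up under the session's HCTL, which is what the device_name_by_hctl lookup above reports before the volume is handed to libvirt.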
{{(pid=70374) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Jul 27 09:32:42 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] No BDM found with device name vdb, not building metadata. {{(pid=70374) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Jul 27 09:32:42 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] No VIF found with MAC fa:16:3e:bf:c6:71, not building metadata {{(pid=70374) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Jul 27 09:32:42 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-86ba2a64-bf01-4ca3-9b86-44c81138ae4f tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Lock "42f4c546-47e4-485b-be29-4081c7557bad" "released" by "nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" :: held 9.718s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:43 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-d807cdab-6438-4c16-8664-0e49c4bb18c9 tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Acquiring lock "42f4c546-47e4-485b-be29-4081c7557bad" by "nova.compute.manager.ComputeManager.detach_volume..do_detach_volume" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-d807cdab-6438-4c16-8664-0e49c4bb18c9 tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Lock "42f4c546-47e4-485b-be29-4081c7557bad" acquired by "nova.compute.manager.ComputeManager.detach_volume..do_detach_volume" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:44 user nova-compute[70374]: INFO nova.compute.manager [None req-d807cdab-6438-4c16-8664-0e49c4bb18c9 tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Detaching volume 6ae2fa33-45a0-4d20-bbaf-a51427718f3c Jul 27 09:32:44 user nova-compute[70374]: DEBUG nova.compute.manager [req-1416af5b-28c8-4820-95ee-42b8ae36fc9b req-11009277-b2d2-4c47-8cc3-a171658aba6d service nova] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Received event network-changed-6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG nova.compute.manager [req-1416af5b-28c8-4820-95ee-42b8ae36fc9b req-11009277-b2d2-4c47-8cc3-a171658aba6d service nova] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Refreshing instance network info cache due to event 
network-changed-6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f. {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-1416af5b-28c8-4820-95ee-42b8ae36fc9b req-11009277-b2d2-4c47-8cc3-a171658aba6d service nova] Acquiring lock "refresh_cache-b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-1416af5b-28c8-4820-95ee-42b8ae36fc9b req-11009277-b2d2-4c47-8cc3-a171658aba6d service nova] Acquired lock "refresh_cache-b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG nova.network.neutron [req-1416af5b-28c8-4820-95ee-42b8ae36fc9b req-11009277-b2d2-4c47-8cc3-a171658aba6d service nova] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Refreshing network info cache for port 6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f {{(pid=70374) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Jul 27 09:32:44 user nova-compute[70374]: INFO nova.virt.block_device [None req-d807cdab-6438-4c16-8664-0e49c4bb18c9 tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Attempting to driver detach volume 6ae2fa33-45a0-4d20-bbaf-a51427718f3c from mountpoint /dev/vdb Jul 27 09:32:44 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-d807cdab-6438-4c16-8664-0e49c4bb18c9 tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Attempting to detach device vdb from instance 42f4c546-47e4-485b-be29-4081c7557bad from the persistent domain config. {{(pid=70374) _detach_from_persistent /opt/stack/nova/nova/virt/libvirt/driver.py:2477}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG nova.virt.libvirt.guest [None req-d807cdab-6438-4c16-8664-0e49c4bb18c9 tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] detach device xml: Jul 27 09:32:44 user nova-compute[70374]: Jul 27 09:32:44 user nova-compute[70374]: Jul 27 09:32:44 user nova-compute[70374]: Jul 27 09:32:44 user nova-compute[70374]: 6ae2fa33-45a0-4d20-bbaf-a51427718f3c Jul 27 09:32:44 user nova-compute[70374]:
Jul 27 09:32:44 user nova-compute[70374]: Jul 27 09:32:44 user nova-compute[70374]: {{(pid=70374) detach_device /opt/stack/nova/nova/virt/libvirt/guest.py:465}} Jul 27 09:32:44 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-d807cdab-6438-4c16-8664-0e49c4bb18c9 tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Successfully detached device vdb from instance 42f4c546-47e4-485b-be29-4081c7557bad from the persistent domain config. Jul 27 09:32:44 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-d807cdab-6438-4c16-8664-0e49c4bb18c9 tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance 42f4c546-47e4-485b-be29-4081c7557bad from the live domain config. {{(pid=70374) _detach_from_live_with_retry /opt/stack/nova/nova/virt/libvirt/driver.py:2513}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG nova.virt.libvirt.guest [None req-d807cdab-6438-4c16-8664-0e49c4bb18c9 tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] detach device xml: Jul 27 09:32:44 user nova-compute[70374]: Jul 27 09:32:44 user nova-compute[70374]: Jul 27 09:32:44 user nova-compute[70374]: Jul 27 09:32:44 user nova-compute[70374]: 6ae2fa33-45a0-4d20-bbaf-a51427718f3c Jul 27 09:32:44 user nova-compute[70374]:
Jul 27 09:32:44 user nova-compute[70374]: Jul 27 09:32:44 user nova-compute[70374]: {{(pid=70374) detach_device /opt/stack/nova/nova/virt/libvirt/guest.py:465}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=70374) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Jul 27 09:32:44 user nova-compute[70374]: INFO nova.compute.manager [-] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] VM Stopped (Lifecycle Event) Jul 27 09:32:44 user nova-compute[70374]: DEBUG nova.compute.manager [None req-e8046713-baa3-4e76-a791-bd8d5b056ac3 None None] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Checking state {{(pid=70374) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG nova.compute.manager [None req-e8046713-baa3-4e76-a791-bd8d5b056ac3 None None] [instance: 25214e8a-c626-46a7-b273-eb491c2fc91b] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: stopped, current task_state: None, current DB power_state: 4, VM power_state: 4 {{(pid=70374) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1398}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG nova.network.neutron [req-1416af5b-28c8-4820-95ee-42b8ae36fc9b req-11009277-b2d2-4c47-8cc3-a171658aba6d service nova] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Updated VIF entry in instance network info cache for port 6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f. {{(pid=70374) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG nova.network.neutron [req-1416af5b-28c8-4820-95ee-42b8ae36fc9b req-11009277-b2d2-4c47-8cc3-a171658aba6d service nova] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Updating instance_info_cache with network_info: [{"id": "6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f", "address": "fa:16:3e:b2:9d:c3", "network": {"id": "158fe9d2-5b60-4b57-bdcb-10de0604f194", "bridge": "br-int", "label": "tempest-TaggedAttachmentsTest-1646104715-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "df3e52a41c1847b199e6dcd09b676fba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6bd9ecfd-f7", "ovs_interfaceid": "6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-1416af5b-28c8-4820-95ee-42b8ae36fc9b req-11009277-b2d2-4c47-8cc3-a171658aba6d service nova] Releasing lock "refresh_cache-b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-2527503c-2b26-4355-945d-94cba18792bd tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] 
Acquiring lock "interface-b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06-None" by "nova.compute.manager.ComputeManager.attach_interface..do_attach_interface" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-2527503c-2b26-4355-945d-94cba18792bd tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Lock "interface-b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06-None" acquired by "nova.compute.manager.ComputeManager.attach_interface..do_attach_interface" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG nova.objects.instance [None req-2527503c-2b26-4355-945d-94cba18792bd tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Lazy-loading 'flavor' on Instance uuid b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Received event virtio-disk1> from libvirt while the driver is waiting for it; dispatched. {{(pid=70374) emit_event /opt/stack/nova/nova/virt/libvirt/driver.py:2360}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-d807cdab-6438-4c16-8664-0e49c4bb18c9 tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance 42f4c546-47e4-485b-be29-4081c7557bad {{(pid=70374) _detach_from_live_and_wait_for_event /opt/stack/nova/nova/virt/libvirt/driver.py:2589}} Jul 27 09:32:44 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-d807cdab-6438-4c16-8664-0e49c4bb18c9 tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Successfully detached device vdb from instance 42f4c546-47e4-485b-be29-4081c7557bad from the live domain config. 
Jul 27 09:32:44 user nova-compute[70374]: DEBUG nova.virt.libvirt.volume.iscsi [None req-d807cdab-6438-4c16-8664-0e49c4bb18c9 tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] calling os-brick to detach iSCSI Volume {{(pid=70374) disconnect_volume /opt/stack/nova/nova/virt/libvirt/volume/iscsi.py:72}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-d807cdab-6438-4c16-8664-0e49c4bb18c9 tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] ==> disconnect_volume: call "{'args': (, {'target_iqn': 'iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c', 'target_portal': '172.16.0.220:3260', 'target_discovered': False, 'auth_method': 'CHAP', 'auth_username': 'UdaFcWLo', 'auth_password': '***', 'target_lun': 0, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False, 'device_path': '/dev/sdd'}, None), 'kwargs': False}}" {{(pid=70374) trace_logging_wrapper /usr/local/lib/python3.10/dist-packages/os_brick/utils.py:175}} Jul 27 09:32:44 user nova-compute[70374]: WARNING os_brick.initiator.connectors.base [None req-d807cdab-6438-4c16-8664-0e49c4bb18c9 tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Service needs to call os_brick.setup() before connecting volumes, if it doesn't it will break on the next release Jul 27 09:32:44 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-d807cdab-6438-4c16-8664-0e49c4bb18c9 tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Acquiring lock "connect_volume" by "os_brick.initiator.connectors.iscsi.ISCSIConnector.disconnect_volume" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:68}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-d807cdab-6438-4c16-8664-0e49c4bb18c9 tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Lock "connect_volume" acquired by "os_brick.initiator.connectors.iscsi.ISCSIConnector.disconnect_volume" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:73}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m discoverydb -o show -P 1 {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m discoverydb -o show -P 1" returned: 0 in 0.022s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[05ce08ba-2db1-47e3-ba95-89d9cd0fc9fb]: (4, ('SENDTARGETS:\nNo targets found.\niSNS:\nNo targets found.\nSTATIC:\nTarget: iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291\n\tPortal: 172.16.0.220:3260,-1\n\t\tIface Name: default\nTarget: iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251\n\tPortal: 172.16.0.220:3260,-1\n\t\tIface Name: default\nTarget: iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c\n\tPortal: 172.16.0.220:3260,-1\n\t\tIface Name: default\nFIRMWARE:\nNo targets found.\n', '')) {{(pid=82164) _call_back 
/usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-d807cdab-6438-4c16-8664-0e49c4bb18c9 tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] iscsiadm ['-m', 'discoverydb', '-o', 'show', '-P', 1]: stdout=SENDTARGETS: Jul 27 09:32:44 user nova-compute[70374]: No targets found. Jul 27 09:32:44 user nova-compute[70374]: iSNS: Jul 27 09:32:44 user nova-compute[70374]: No targets found. Jul 27 09:32:44 user nova-compute[70374]: STATIC: Jul 27 09:32:44 user nova-compute[70374]: Target: iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 Jul 27 09:32:44 user nova-compute[70374]: Portal: 172.16.0.220:3260,-1 Jul 27 09:32:44 user nova-compute[70374]: Iface Name: default Jul 27 09:32:44 user nova-compute[70374]: Target: iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 Jul 27 09:32:44 user nova-compute[70374]: Portal: 172.16.0.220:3260,-1 Jul 27 09:32:44 user nova-compute[70374]: Iface Name: default Jul 27 09:32:44 user nova-compute[70374]: Target: iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c Jul 27 09:32:44 user nova-compute[70374]: Portal: 172.16.0.220:3260,-1 Jul 27 09:32:44 user nova-compute[70374]: Iface Name: default Jul 27 09:32:44 user nova-compute[70374]: FIRMWARE: Jul 27 09:32:44 user nova-compute[70374]: No targets found. Jul 27 09:32:44 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm_bare /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1182}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-d807cdab-6438-4c16-8664-0e49c4bb18c9 tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Regex to get portals from discoverydb: ^SENDTARGETS: Jul 27 09:32:44 user nova-compute[70374]: .*?^DiscoveryAddress: 172.16.0.220,3260.*? 
Jul 27 09:32:44 user nova-compute[70374]: (.*?)^(?:DiscoveryAddress|iSNS):.* {{(pid=70374) _get_discoverydb_portals /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:371}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-d807cdab-6438-4c16-8664-0e49c4bb18c9 tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Getting connected devices for (ips,iqns,luns)=[('172.16.0.220:3260', 'iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c', 0)] {{(pid=70374) _get_connection_devices /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:819}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node" returned: 0 in 0.020s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[a645e966-4504-4474-bf1a-9204c489b82e]: (4, ('172.16.0.220:3260,4294967295 iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c\n172.16.0.220:3260,4294967295 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291\n172.16.0.220:3260,4294967295 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m session {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m session" returned: 0 in 0.023s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[f1c63b7c-c625-4094-b427-2fc6d35168c2]: (4, ('tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash)\ntcp: [4] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash)\ntcp: [5] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c (non-flash)\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-d807cdab-6438-4c16-8664-0e49c4bb18c9 tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] iscsiadm ('-m', 'session'): stdout=tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash) Jul 27 09:32:44 user nova-compute[70374]: tcp: [4] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash) Jul 27 09:32:44 user nova-compute[70374]: tcp: [5] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c (non-flash) Jul 27 09:32:44 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm_bare /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1182}} Jul 27 09:32:44 user nova-compute[70374]: 
DEBUG os_brick.initiator.connectors.iscsi [None req-d807cdab-6438-4c16-8664-0e49c4bb18c9 tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] iscsi session list stdout=tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash) Jul 27 09:32:44 user nova-compute[70374]: tcp: [4] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash) Jul 27 09:32:44 user nova-compute[70374]: tcp: [5] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c (non-flash) Jul 27 09:32:44 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsi_session /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1171}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-d807cdab-6438-4c16-8664-0e49c4bb18c9 tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Resulting device map defaultdict(. at 0x7f27fc340160>, {('172.16.0.220:3260', 'iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c'): ({'sdd'}, set())}) {{(pid=70374) _get_connection_devices /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:852}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-d807cdab-6438-4c16-8664-0e49c4bb18c9 tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Removing single pathed devices sdd {{(pid=70374) remove_connection /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:309}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.019s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[2957cc04-5bd4-4e85-a3b7-218bfa521107]: (4, ('path checker states:\nup 3\n\npaths: 0\nbusy: False\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd del path /dev/sdd {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "multipathd del path /dev/sdd" returned: 0 in 0.019s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[5c859fe4-ab68-4f63-806b-bf900dbad44a]: (4, ('ok\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-d807cdab-6438-4c16-8664-0e49c4bb18c9 tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Flushing IO for device /dev/sdd {{(pid=70374) flush_device_io /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:369}} Jul 27 09:32:44 user 
nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): blockdev --flushbufs /dev/sdd {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "blockdev --flushbufs /dev/sdd" returned: 0 in 0.017s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[41da5410-7c63-4cf1-a9b1-b7a15fbd68b5]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-d807cdab-6438-4c16-8664-0e49c4bb18c9 tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Remove SCSI device /dev/sdd with /sys/block/sdd/device/delete {{(pid=70374) remove_scsi_device /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:83}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): tee -a /sys/block/sdd/device/delete {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "tee -a /sys/block/sdd/device/delete" returned: 0 in 0.036s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[8d356395-58c5-4bb0-843b-5f20a60a10f8]: (4, ('1', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-d807cdab-6438-4c16-8664-0e49c4bb18c9 tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Checking to see if SCSI volumes sdd have been removed. {{(pid=70374) wait_for_volumes_removal /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:91}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-d807cdab-6438-4c16-8664-0e49c4bb18c9 tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] SCSI volumes sdd have been removed. 
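The single-path cleanup logged above, together with the logout and node deletion that follow, boils down to a short command sequence. The sketch below assumes the device is /dev/sdd with no multipath device on top and that the caller has root privileges; the run() helper is hypothetical, and the timeout loop is a simplification of os-brick's wait_for_volumes_removal.

# Rough equivalent of the single-path teardown in this log (assumed values; run as root).
import os
import subprocess
import time

DEV = "sdd"
PORTAL = "172.16.0.220:3260"
IQN = "iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c"

def run(*cmd):
    return subprocess.run(cmd, capture_output=True, text=True, check=True)

# 1. Drop the path from multipathd and flush buffered writes.
run("multipathd", "show", "status")
run("multipathd", "del", "path", f"/dev/{DEV}")
run("blockdev", "--flushbufs", f"/dev/{DEV}")

# 2. Delete the SCSI device and wait briefly for the kernel to drop it.
with open(f"/sys/block/{DEV}/device/delete", "w") as f:
    f.write("1")
for _ in range(50):
    if not os.path.exists(f"/dev/{DEV}"):
        break
    time.sleep(0.2)

# 3. Disable auto-startup, log out of the session and remove the node record,
#    matching the iscsiadm calls that follow in the log.
run("iscsiadm", "-m", "node", "-T", IQN, "-p", PORTAL,
    "--op", "update", "-n", "node.startup", "-v", "manual")
run("iscsiadm", "-m", "node", "-T", IQN, "-p", PORTAL, "--logout")
run("iscsiadm", "-m", "node", "-T", IQN, "-p", PORTAL, "--op", "delete")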
{{(pid=70374) wait_for_volumes_removal /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:101}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-d807cdab-6438-4c16-8664-0e49c4bb18c9 tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Disconnecting from: [('172.16.0.220:3260', 'iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c')] {{(pid=70374) _disconnect_connection /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1160}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c -p 172.16.0.220:3260 --op update -n node.startup -v manual {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG nova.objects.instance [None req-2527503c-2b26-4355-945d-94cba18792bd tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Lazy-loading 'pci_requests' on Instance uuid b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c -p 172.16.0.220:3260 --op update -n node.startup -v manual" returned: 0 in 0.020s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[f65995dc-7e08-4d44-b460-dbfc21cc7ab1]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:44 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-d807cdab-6438-4c16-8664-0e49c4bb18c9 tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] iscsiadm ('--op', 'update', '-n', 'node.startup', '-v', 'manual'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:45 user nova-compute[70374]: DEBUG nova.network.neutron [None req-2527503c-2b26-4355-945d-94cba18792bd tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] allocate_for_instance() {{(pid=70374) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} Jul 27 09:32:45 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c -p 172.16.0.220:3260 --logout {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:45 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c -p 172.16.0.220:3260 --logout" returned: 0 in 0.076s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:45 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[567e5fb9-bbff-43a4-a18e-0efb1ac3187a]: (4, ('Logging out of session [sid: 5, target: 
iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c, portal: 172.16.0.220,3260]\nLogout of [sid: 5, target: iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c, portal: 172.16.0.220,3260] successful.\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:45 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-d807cdab-6438-4c16-8664-0e49c4bb18c9 tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] iscsiadm ('--logout',): stdout=Logging out of session [sid: 5, target: iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c, portal: 172.16.0.220,3260] Jul 27 09:32:45 user nova-compute[70374]: Logout of [sid: 5, target: iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c, portal: 172.16.0.220,3260] successful. Jul 27 09:32:45 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:45 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c -p 172.16.0.220:3260 --op delete {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:45 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:6ae2fa33-45a0-4d20-bbaf-a51427718f3c -p 172.16.0.220:3260 --op delete" returned: 0 in 0.024s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:45 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[d45b9504-27b1-4e97-a2e7-b0557bdddb38]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:45 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-d807cdab-6438-4c16-8664-0e49c4bb18c9 tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] iscsiadm ('--op', 'delete'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:45 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-d807cdab-6438-4c16-8664-0e49c4bb18c9 tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Lock "connect_volume" "released" by "os_brick.initiator.connectors.iscsi.ISCSIConnector.disconnect_volume" :: held 0.324s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:87}} Jul 27 09:32:45 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-d807cdab-6438-4c16-8664-0e49c4bb18c9 tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] <== disconnect_volume: return (325ms) None {{(pid=70374) trace_logging_wrapper /usr/local/lib/python3.10/dist-packages/os_brick/utils.py:202}} Jul 27 09:32:45 user nova-compute[70374]: DEBUG nova.virt.libvirt.volume.iscsi [None req-d807cdab-6438-4c16-8664-0e49c4bb18c9 tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] [instance: 42f4c546-47e4-485b-be29-4081c7557bad] Disconnected iSCSI Volume {{(pid=70374) disconnect_volume 
/opt/stack/nova/nova/virt/libvirt/volume/iscsi.py:79}} Jul 27 09:32:45 user nova-compute[70374]: DEBUG nova.policy [None req-2527503c-2b26-4355-945d-94cba18792bd tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6d33f8cd041046c18af25f56b63b6bb5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'df3e52a41c1847b199e6dcd09b676fba', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=70374) authorize /opt/stack/nova/nova/policy.py:203}} Jul 27 09:32:45 user nova-compute[70374]: DEBUG nova.network.neutron [None req-2527503c-2b26-4355-945d-94cba18792bd tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Successfully created port: e86f785e-63db-4bd0-92b9-6b813174c7a5 {{(pid=70374) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} Jul 27 09:32:45 user nova-compute[70374]: DEBUG nova.objects.instance [None req-d807cdab-6438-4c16-8664-0e49c4bb18c9 tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Lazy-loading 'flavor' on Instance uuid 42f4c546-47e4-485b-be29-4081c7557bad {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-d807cdab-6438-4c16-8664-0e49c4bb18c9 tempest-VolumesAdminNegativeTest-1398001736 tempest-VolumesAdminNegativeTest-1398001736-project-member] Lock "42f4c546-47e4-485b-be29-4081c7557bad" "released" by "nova.compute.manager.ComputeManager.detach_volume..do_detach_volume" :: held 1.952s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG nova.virt.libvirt.volume.iscsi [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Calling os-brick to attach iSCSI Volume {{(pid=70374) connect_volume /opt/stack/nova/nova/virt/libvirt/volume/iscsi.py:63}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] ==> connect_volume: call "{'self': , 'connection_properties': {'target_iqn': 'iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac', 'target_portal': '172.16.0.220:3260', 'target_discovered': False, 'auth_method': 'CHAP', 'auth_username': 'KUpudWeL', 'auth_password': '***', 'target_lun': 0, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False}}" {{(pid=70374) trace_logging_wrapper /usr/local/lib/python3.10/dist-packages/os_brick/utils.py:175}} Jul 27 09:32:46 user nova-compute[70374]: WARNING os_brick.initiator.connectors.base [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Service needs to call os_brick.setup() before connecting volumes, if it doesn't it will break on the next release Jul 27 09:32:46 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None 
req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Acquiring lock "connect_volume" by "os_brick.initiator.connectors.iscsi.ISCSIConnector.connect_volume" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:68}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Lock "connect_volume" acquired by "os_brick.initiator.connectors.iscsi.ISCSIConnector.connect_volume" :: waited 0.002s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:73}} Jul 27 09:32:46 user nova-compute[70374]: WARNING os_brick.initiator.connectors.base [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Service needs to call os_brick.setup() before connecting volumes, if it doesn't it will break on the next release Jul 27 09:32:46 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Acquiring lock "connect_to_iscsi_portal-172.16.0.220:3260-iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac" by "os_brick.initiator.connectors.iscsi.ISCSIConnector._connect_to_iscsi_portal_unsafe" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:68}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Lock "connect_to_iscsi_portal-172.16.0.220:3260-iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac" acquired by "os_brick.initiator.connectors.iscsi.ISCSIConnector._connect_to_iscsi_portal_unsafe" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:73}} Jul 27 09:32:46 user nova-compute[70374]: INFO os_brick.initiator.connectors.iscsi [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Trying to connect to iSCSI portal 172.16.0.220:3260 Jul 27 09:32:46 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac -p 172.16.0.220:3260 {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac -p 172.16.0.220:3260" returned: 21 in 0.017s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[96d868e0-82f6-4110-87f3-96fc449109c1]: (4, ('', 'iscsiadm: No records found\n')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] iscsiadm (): stdout= 
stderr=iscsiadm: No records found Jul 27 09:32:46 user nova-compute[70374]: {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac -p 172.16.0.220:3260 --interface default --op new {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac -p 172.16.0.220:3260 --interface default --op new" returned: 0 in 0.020s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[24cee924-047f-4aa7-9730-12d72abcfcce]: (4, ('New iSCSI node [tcp:[hw=,ip=,net_if=,iscsi_if=default] 172.16.0.220,3260,-1 iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac] added\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] iscsiadm ('--interface', 'default', '--op', 'new'): stdout=New iSCSI node [tcp:[hw=,ip=,net_if=,iscsi_if=default] 172.16.0.220,3260,-1 iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac] added Jul 27 09:32:46 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac -p 172.16.0.220:3260 --op update -n node.session.scan -v manual {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac -p 172.16.0.220:3260 --op update -n node.session.scan -v manual" returned: 0 in 0.017s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[7be4a382-8eee-49ed-9b3b-c413a7ad83e7]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] iscsiadm ('--op', 'update', '-n', 'node.session.scan', '-v', 'manual'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac -p 172.16.0.220:3260 --op update -n node.session.auth.authmethod -v CHAP {{(pid=82164) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac -p 172.16.0.220:3260 --op update -n node.session.auth.authmethod -v CHAP" returned: 0 in 0.017s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[59d76361-b43e-4710-ab55-d1d0c8e5ee4e]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] iscsiadm ('--op', 'update', '-n', 'node.session.auth.authmethod', '-v', 'CHAP'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac -p 172.16.0.220:3260 --op update -n node.session.auth.username -v KUpudWeL {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac -p 172.16.0.220:3260 --op update -n node.session.auth.username -v KUpudWeL" returned: 0 in 0.019s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[e2ed9a69-e166-4090-9c7c-202ec56e5127]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] iscsiadm ('--op', 'update', '-n', 'node.session.auth.username', '-v', 'KUpudWeL'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac -p 172.16.0.220:3260 --op update -n node.session.auth.password -v *** {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac -p 172.16.0.220:3260 --op update -n node.session.auth.password -v ***" returned: 0 in 0.021s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[0515c333-a314-4c69-962c-5fbdee2f26be]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None 
req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] iscsiadm ('--op', 'update', '-n', 'node.session.auth.password', '-v', '***'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m session {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m session" returned: 0 in 0.020s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[2a1f2ba0-7920-4f4f-839f-b2dfc00328c4]: (4, ('tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash)\ntcp: [4] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash)\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] iscsiadm ('-m', 'session'): stdout=tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash) Jul 27 09:32:46 user nova-compute[70374]: tcp: [4] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash) Jul 27 09:32:46 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm_bare /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1182}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] iscsi session list stdout=tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash) Jul 27 09:32:46 user nova-compute[70374]: tcp: [4] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash) Jul 27 09:32:46 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsi_session /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1171}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac -p 172.16.0.220:3260 --login {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac -p 172.16.0.220:3260 --login" returned: 0 in 0.062s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[8d96ce06-8405-48f6-b2aa-1df59504e9c4]: (4, ('Logging in to [iface: default, target: iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac, portal: 172.16.0.220,3260]\nLogin to [iface: default, target: 
iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac, portal: 172.16.0.220,3260] successful.\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] iscsiadm ('--login',): stdout=Logging in to [iface: default, target: iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac, portal: 172.16.0.220,3260] Jul 27 09:32:46 user nova-compute[70374]: Login to [iface: default, target: iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac, portal: 172.16.0.220,3260] successful. Jul 27 09:32:46 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac -p 172.16.0.220:3260 --op update -n node.startup -v automatic {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac -p 172.16.0.220:3260 --op update -n node.startup -v automatic" returned: 0 in 0.018s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[45b32679-4142-497d-ad83-f996fc7cc2f7]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] iscsiadm ('--op', 'update', '-n', 'node.startup', '-v', 'automatic'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m session {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m session" returned: 0 in 0.025s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[264eae0d-009e-40a6-983d-349a759f89ab]: (4, ('tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash)\ntcp: [4] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash)\ntcp: [6] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac (non-flash)\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] iscsiadm ('-m', 'session'): stdout=tcp: [3] 
172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash) Jul 27 09:32:46 user nova-compute[70374]: tcp: [4] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash) Jul 27 09:32:46 user nova-compute[70374]: tcp: [6] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac (non-flash) Jul 27 09:32:46 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm_bare /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1182}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] iscsi session list stdout=tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash) Jul 27 09:32:46 user nova-compute[70374]: tcp: [4] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash) Jul 27 09:32:46 user nova-compute[70374]: tcp: [6] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac (non-flash) Jul 27 09:32:46 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsi_session /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1171}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Lock "connect_to_iscsi_portal-172.16.0.220:3260-iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac" "released" by "os_brick.initiator.connectors.iscsi.ISCSIConnector._connect_to_iscsi_portal_unsafe" :: held 0.273s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:87}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Connected to 172.16.0.220:3260 {{(pid=70374) _connect_vol /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:633}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] HCTL ('35', '-', '-', 0) found on session 6 with lun 0 {{(pid=70374) get_hctl /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:694}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Scanning host 35 c: -, t: -, l: 0) {{(pid=70374) scan_iscsi /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:719}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): tee -a /sys/class/scsi_host/host35/scan {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "tee -a /sys/class/scsi_host/host35/scan" returned: 0 in 0.022s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: 
reply[1377a1c6-e403-4088-8754-3518601c6c2d]: (4, ('- - 0', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Searching for a device in session 6 and hctl ['35', '*', '*', 0] yield: sdd {{(pid=70374) device_name_by_hctl /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:713}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Connected to sdd using {'target_iqn': 'iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac', 'target_portal': '172.16.0.220:3260', 'target_discovered': False, 'auth_method': 'CHAP', 'auth_username': 'KUpudWeL', 'auth_password': '***', 'target_lun': 0, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False} {{(pid=70374) _connect_vol /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:661}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG nova.network.neutron [None req-2527503c-2b26-4355-945d-94cba18792bd tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Successfully updated port: e86f785e-63db-4bd0-92b9-6b813174c7a5 {{(pid=70374) _update_port /opt/stack/nova/nova/network/neutron.py:586}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-2527503c-2b26-4355-945d-94cba18792bd tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Acquiring lock "refresh_cache-b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-2527503c-2b26-4355-945d-94cba18792bd tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Acquired lock "refresh_cache-b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG nova.network.neutron [None req-2527503c-2b26-4355-945d-94cba18792bd tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Building network info cache for instance {{(pid=70374) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2002}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG nova.compute.manager [req-27aed475-3713-4f3d-8447-1f7fbcf75e85 req-4da4ed27-c1e1-49e2-b178-7aeb99a8a2bf service nova] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Received event network-changed-e86f785e-63db-4bd0-92b9-6b813174c7a5 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG nova.compute.manager [req-27aed475-3713-4f3d-8447-1f7fbcf75e85 req-4da4ed27-c1e1-49e2-b178-7aeb99a8a2bf service nova] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Refreshing instance network info cache due to event network-changed-e86f785e-63db-4bd0-92b9-6b813174c7a5. 
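The entries above trace os-brick's connect-to-portal flow for target iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac: iscsiadm exits with 21 ("No records found"), so a node record is created with --interface default --op new, node.session.scan is forced to manual, the CHAP authmethod, username and password are written with --op update, the session list is checked, the portal is logged into, and node.startup is set to automatic before host 35 is rescanned and /dev/sdd appears. A minimal standalone sketch of that iscsiadm sequence follows; it is illustrative only, not os-brick's implementation, and run_iscsiadm, connect_portal and the credential arguments are made-up names and placeholders.

    import subprocess

    def run_iscsiadm(*args, check=True):
        """Run iscsiadm and return (rc, stdout, stderr), mirroring the logged commands."""
        proc = subprocess.run(["iscsiadm", *args], capture_output=True, text=True)
        if check and proc.returncode != 0:
            raise RuntimeError(f"iscsiadm {args} failed: {proc.stderr.strip()}")
        return proc.returncode, proc.stdout, proc.stderr

    def connect_portal(target_iqn, portal, chap_user, chap_pass):
        node = ["-m", "node", "-T", target_iqn, "-p", portal]
        # Exit code 21 ("No records found") means the node record does not exist yet.
        rc, _, _ = run_iscsiadm(*node, check=False)
        if rc == 21:
            run_iscsiadm(*node, "--interface", "default", "--op", "new")
        # Disable automatic LUN scans; the caller scans the SCSI host explicitly after login.
        run_iscsiadm(*node, "--op", "update", "-n", "node.session.scan", "-v", "manual")
        # CHAP credentials for this node record.
        run_iscsiadm(*node, "--op", "update", "-n", "node.session.auth.authmethod", "-v", "CHAP")
        run_iscsiadm(*node, "--op", "update", "-n", "node.session.auth.username", "-v", chap_user)
        run_iscsiadm(*node, "--op", "update", "-n", "node.session.auth.password", "-v", chap_pass)
        # Log in only if there is no session for this target yet, then make it persistent.
        _, sessions, _ = run_iscsiadm("-m", "session", check=False)
        if target_iqn not in sessions:
            run_iscsiadm(*node, "--login")
        run_iscsiadm(*node, "--op", "update", "-n", "node.startup", "-v", "automatic")
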
{{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11037}} Jul 27 09:32:46 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-27aed475-3713-4f3d-8447-1f7fbcf75e85 req-4da4ed27-c1e1-49e2-b178-7aeb99a8a2bf service nova] Acquiring lock "refresh_cache-b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG nova.virt.libvirt.volume.iscsi [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Calling os-brick to attach iSCSI Volume {{(pid=70374) connect_volume /opt/stack/nova/nova/virt/libvirt/volume/iscsi.py:63}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] ==> connect_volume: call "{'self': , 'connection_properties': {'target_iqn': 'iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e', 'target_portal': '172.16.0.220:3260', 'target_discovered': False, 'auth_method': 'CHAP', 'auth_username': 'NYfpjgzV', 'auth_password': '***', 'target_lun': 0, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False}}" {{(pid=70374) trace_logging_wrapper /usr/local/lib/python3.10/dist-packages/os_brick/utils.py:175}} Jul 27 09:32:47 user nova-compute[70374]: WARNING os_brick.initiator.connectors.base [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Service needs to call os_brick.setup() before connecting volumes, if it doesn't it will break on the next release Jul 27 09:32:47 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Acquiring lock "connect_volume" by "os_brick.initiator.connectors.iscsi.ISCSIConnector.connect_volume" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:68}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG nova.network.neutron [None req-2527503c-2b26-4355-945d-94cba18792bd tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Updating instance_info_cache with network_info: [{"id": "6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f", "address": "fa:16:3e:b2:9d:c3", "network": {"id": "158fe9d2-5b60-4b57-bdcb-10de0604f194", "bridge": "br-int", "label": "tempest-TaggedAttachmentsTest-1646104715-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "df3e52a41c1847b199e6dcd09b676fba", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6bd9ecfd-f7", "ovs_interfaceid": "6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f", "qbh_params": null, 
"qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e86f785e-63db-4bd0-92b9-6b813174c7a5", "address": "fa:16:3e:f8:a9:cb", "network": {"id": "f4c65a93-da7c-42e7-9c5e-c721d1ecb59e", "bridge": "br-int", "label": "tempest-tagged-attachments-test-net-193407618", "subnets": [{"cidr": "10.10.10.0/24", "dns": [], "gateway": {"address": "10.10.10.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.10.80", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.10.10.1"}}], "meta": {"injected": false, "tenant_id": "df3e52a41c1847b199e6dcd09b676fba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape86f785e-63", "ovs_interfaceid": "e86f785e-63db-4bd0-92b9-6b813174c7a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-2527503c-2b26-4355-945d-94cba18792bd tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Releasing lock "refresh_cache-b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-27aed475-3713-4f3d-8447-1f7fbcf75e85 req-4da4ed27-c1e1-49e2-b178-7aeb99a8a2bf service nova] Acquired lock "refresh_cache-b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG nova.network.neutron [req-27aed475-3713-4f3d-8447-1f7fbcf75e85 req-4da4ed27-c1e1-49e2-b178-7aeb99a8a2bf service nova] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Refreshing network info cache for port e86f785e-63db-4bd0-92b9-6b813174c7a5 {{(pid=70374) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1999}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG nova.virt.libvirt.vif [None req-2527503c-2b26-4355-945d-94cba18792bd tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2023-07-27T09:30:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-175106609',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(6),hidden=False,host='user',hostname='tempest-device-tagging-server-175106609',id=11,image_ref='35458adf-261a-4e0b-a4db-b243619b2394',info_cache=InstanceInfoCache,instance_type_id=6,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOTnqzjorB039ilJl8VK7YcSm6BfDbMAMeMlN6KuNuE2BR5Ci7E8V9F4eHRzTVNR5ErMNNWpJQddk0yLJsVN++T9e5hUTfZ9niEsaZXI1P72KyNTxQBI7EWBQg10Ylojmg==',key_name='tempest-keypair-693359237',keypairs=,launch_index=0,launched_at=2023-07-27T09:31:03Z,launched_on='user',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='df3e52a41c1847b199e6dcd09b676fba',ramdisk_id='',reservation_id='r-tordp007',resources=,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='35458adf-261a-4e0b-a4db-b243619b2394',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-TaggedAttachmentsTest-493274998',owner_user_name='tempest-TaggedAttachmentsTest-493274998-project-member'},tags=,task_state=None,terminated_at=None,trusted_certs=,updated_at=2023-07-27T09:31:03Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='6d33f8cd041046c18af25f56b63b6bb5',uuid=b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e86f785e-63db-4bd0-92b9-6b813174c7a5", "address": "fa:16:3e:f8:a9:cb", "network": {"id": "f4c65a93-da7c-42e7-9c5e-c721d1ecb59e", "bridge": "br-int", "label": "tempest-tagged-attachments-test-net-193407618", "subnets": [{"cidr": "10.10.10.0/24", "dns": [], "gateway": {"address": "10.10.10.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.10.80", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.10.10.1"}}], "meta": {"injected": false, "tenant_id": "df3e52a41c1847b199e6dcd09b676fba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape86f785e-63", "ovs_interfaceid": "e86f785e-63db-4bd0-92b9-6b813174c7a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-2527503c-2b26-4355-945d-94cba18792bd tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Converting VIF {"id": "e86f785e-63db-4bd0-92b9-6b813174c7a5", "address": "fa:16:3e:f8:a9:cb", "network": {"id": "f4c65a93-da7c-42e7-9c5e-c721d1ecb59e", "bridge": "br-int", "label": "tempest-tagged-attachments-test-net-193407618", "subnets": [{"cidr": "10.10.10.0/24", "dns": [], "gateway": {"address": "10.10.10.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.10.80", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": 
[], "version": 4, "meta": {"dhcp_server": "10.10.10.1"}}], "meta": {"injected": false, "tenant_id": "df3e52a41c1847b199e6dcd09b676fba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape86f785e-63", "ovs_interfaceid": "e86f785e-63db-4bd0-92b9-6b813174c7a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-2527503c-2b26-4355-945d-94cba18792bd tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f8:a9:cb,bridge_name='br-int',has_traffic_filtering=True,id=e86f785e-63db-4bd0-92b9-6b813174c7a5,network=Network(f4c65a93-da7c-42e7-9c5e-c721d1ecb59e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape86f785e-63') {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG os_vif [None req-2527503c-2b26-4355-945d-94cba18792bd tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f8:a9:cb,bridge_name='br-int',has_traffic_filtering=True,id=e86f785e-63db-4bd0-92b9-6b813174c7a5,network=Network(f4c65a93-da7c-42e7-9c5e-c721d1ecb59e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape86f785e-63') {{(pid=70374) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape86f785e-63, may_exist=True, interface_attrs={}) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape86f785e-63, col_values=(('external_ids', {'iface-id': 'e86f785e-63db-4bd0-92b9-6b813174c7a5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f8:a9:cb', 'vm-uuid': 'b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06'}),), if_exists=True) {{(pid=70374) do_commit 
/usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:47 user nova-compute[70374]: INFO os_vif [None req-2527503c-2b26-4355-945d-94cba18792bd tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f8:a9:cb,bridge_name='br-int',has_traffic_filtering=True,id=e86f785e-63db-4bd0-92b9-6b813174c7a5,network=Network(f4c65a93-da7c-42e7-9c5e-c721d1ecb59e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape86f785e-63') Jul 27 09:32:47 user nova-compute[70374]: DEBUG nova.virt.libvirt.vif [None req-2527503c-2b26-4355-945d-94cba18792bd tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2023-07-27T09:30:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-175106609',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(6),hidden=False,host='user',hostname='tempest-device-tagging-server-175106609',id=11,image_ref='35458adf-261a-4e0b-a4db-b243619b2394',info_cache=InstanceInfoCache,instance_type_id=6,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOTnqzjorB039ilJl8VK7YcSm6BfDbMAMeMlN6KuNuE2BR5Ci7E8V9F4eHRzTVNR5ErMNNWpJQddk0yLJsVN++T9e5hUTfZ9niEsaZXI1P72KyNTxQBI7EWBQg10Ylojmg==',key_name='tempest-keypair-693359237',keypairs=,launch_index=0,launched_at=2023-07-27T09:31:03Z,launched_on='user',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='df3e52a41c1847b199e6dcd09b676fba',ramdisk_id='',reservation_id='r-tordp007',resources=,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='35458adf-261a-4e0b-a4db-b243619b2394',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-TaggedAttachmentsTest-493274998',owner_user_name='tempest-TaggedAttachmentsTest-493274998-project-member'},tags=,task_state=None,terminated_at=None,trusted_certs=,updated_at=2023-07-27T09:31:03Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='6d33f8cd041046c18af25f56b63b6bb5',uuid=b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e86f785e-63db-4bd0-92b9-6b813174c7a5", "address": "fa:16:3e:f8:a9:cb", "network": {"id": "f4c65a93-da7c-42e7-9c5e-c721d1ecb59e", "bridge": "br-int", "label": "tempest-tagged-attachments-test-net-193407618", "subnets": [{"cidr": "10.10.10.0/24", "dns": [], "gateway": {"address": "10.10.10.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.10.80", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.10.10.1"}}], "meta": {"injected": false, "tenant_id": "df3e52a41c1847b199e6dcd09b676fba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape86f785e-63", "ovs_interfaceid": "e86f785e-63db-4bd0-92b9-6b813174c7a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70374) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-2527503c-2b26-4355-945d-94cba18792bd tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Converting VIF {"id": "e86f785e-63db-4bd0-92b9-6b813174c7a5", "address": "fa:16:3e:f8:a9:cb", "network": {"id": "f4c65a93-da7c-42e7-9c5e-c721d1ecb59e", "bridge": "br-int", "label": "tempest-tagged-attachments-test-net-193407618", "subnets": [{"cidr": "10.10.10.0/24", "dns": [], "gateway": {"address": "10.10.10.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.10.80", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.10.10.1"}}], "meta": {"injected": false, "tenant_id": "df3e52a41c1847b199e6dcd09b676fba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape86f785e-63", "ovs_interfaceid": "e86f785e-63db-4bd0-92b9-6b813174c7a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-2527503c-2b26-4355-945d-94cba18792bd tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f8:a9:cb,bridge_name='br-int',has_traffic_filtering=True,id=e86f785e-63db-4bd0-92b9-6b813174c7a5,network=Network(f4c65a93-da7c-42e7-9c5e-c721d1ecb59e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape86f785e-63') {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG nova.virt.libvirt.guest [None req-2527503c-2b26-4355-945d-94cba18792bd tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] attach device xml: Jul 27 09:32:47 user nova-compute[70374]: Jul 27 09:32:47 user nova-compute[70374]: Jul 27 09:32:47 user nova-compute[70374]: Jul 27 09:32:47 user nova-compute[70374]: Jul 27 09:32:47 user nova-compute[70374]: Jul 27 09:32:47 user nova-compute[70374]: {{(pid=70374) attach_device /opt/stack/nova/nova/virt/libvirt/guest.py:339}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Lock "connect_volume" "released" by "os_brick.initiator.connectors.iscsi.ISCSIConnector.connect_volume" :: held 1.311s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:87}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] <== connect_volume: return (1313ms) {'type': 'block', 'scsi_wwn': '23065313130373465', 'path': '/dev/sdd'} {{(pid=70374) trace_logging_wrapper /usr/local/lib/python3.10/dist-packages/os_brick/utils.py:202}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG nova.virt.libvirt.volume.iscsi [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Attached iSCSI volume {'type': 'block', 'scsi_wwn': '23065313130373465', 'path': '/dev/sdd'} {{(pid=70374) connect_volume /opt/stack/nova/nova/virt/libvirt/volume/iscsi.py:65}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Lock 
"connect_volume" acquired by "os_brick.initiator.connectors.iscsi.ISCSIConnector.connect_volume" :: waited 0.363s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:73}} Jul 27 09:32:47 user nova-compute[70374]: WARNING os_brick.initiator.connectors.base [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Service needs to call os_brick.setup() before connecting volumes, if it doesn't it will break on the next release Jul 27 09:32:47 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Acquiring lock "connect_to_iscsi_portal-172.16.0.220:3260-iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e" by "os_brick.initiator.connectors.iscsi.ISCSIConnector._connect_to_iscsi_portal_unsafe" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:68}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Lock "connect_to_iscsi_portal-172.16.0.220:3260-iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e" acquired by "os_brick.initiator.connectors.iscsi.ISCSIConnector._connect_to_iscsi_portal_unsafe" :: waited 0.002s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:73}} Jul 27 09:32:47 user nova-compute[70374]: INFO os_brick.initiator.connectors.iscsi [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Trying to connect to iSCSI portal 172.16.0.220:3260 Jul 27 09:32:47 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e -p 172.16.0.220:3260 {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG nova.objects.instance [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Lazy-loading 'flavor' on Instance uuid d35fe056-8279-479a-a673-6c61e5ec6933 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e -p 172.16.0.220:3260" returned: 21 in 0.024s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[4b077ae6-d439-41f8-aa66-7e6823ca80d2]: (4, ('', 'iscsiadm: No records found\n')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-db80aa10-c0ae-4202-99c3-a07c936df96c 
tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] iscsiadm (): stdout= stderr=iscsiadm: No records found Jul 27 09:32:47 user nova-compute[70374]: {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e -p 172.16.0.220:3260 --interface default --op new {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG nova.virt.libvirt.guest [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] attach device xml: Jul 27 09:32:47 user nova-compute[70374]: Jul 27 09:32:47 user nova-compute[70374]: Jul 27 09:32:47 user nova-compute[70374]: Jul 27 09:32:47 user nova-compute[70374]: 25fa94b4-c6dc-4dbd-add9-e3a58523baac Jul 27 09:32:47 user nova-compute[70374]: Jul 27 09:32:47 user nova-compute[70374]: {{(pid=70374) attach_device /opt/stack/nova/nova/virt/libvirt/guest.py:339}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e -p 172.16.0.220:3260 --interface default --op new" returned: 0 in 0.027s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[da9ae4dd-5444-4830-adc1-455f3e9217db]: (4, ('New iSCSI node [tcp:[hw=,ip=,net_if=,iscsi_if=default] 172.16.0.220,3260,-1 iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e] added\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] iscsiadm ('--interface', 'default', '--op', 'new'): stdout=New iSCSI node [tcp:[hw=,ip=,net_if=,iscsi_if=default] 172.16.0.220,3260,-1 iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e] added Jul 27 09:32:47 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e -p 172.16.0.220:3260 --op update -n node.session.scan -v manual {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e -p 172.16.0.220:3260 --op update -n node.session.scan -v manual" returned: 0 in 0.019s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[2df39f22-146a-4bd5-85a8-ed86a261bac6]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:47 user 
nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] iscsiadm ('--op', 'update', '-n', 'node.session.scan', '-v', 'manual'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e -p 172.16.0.220:3260 --op update -n node.session.auth.authmethod -v CHAP {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e -p 172.16.0.220:3260 --op update -n node.session.auth.authmethod -v CHAP" returned: 0 in 0.019s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[28809ec7-5eca-4160-872d-c79f00d1ff9f]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] iscsiadm ('--op', 'update', '-n', 'node.session.auth.authmethod', '-v', 'CHAP'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e -p 172.16.0.220:3260 --op update -n node.session.auth.username -v NYfpjgzV {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e -p 172.16.0.220:3260 --op update -n node.session.auth.username -v NYfpjgzV" returned: 0 in 0.025s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[0120463f-a106-4222-be11-5c13df567504]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] iscsiadm ('--op', 'update', '-n', 'node.session.auth.username', '-v', 'NYfpjgzV'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e -p 172.16.0.220:3260 --op update -n node.session.auth.password -v *** {{(pid=82164) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e -p 172.16.0.220:3260 --op update -n node.session.auth.password -v ***" returned: 0 in 0.019s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[c7f7f689-8551-4672-9445-16add95da657]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] iscsiadm ('--op', 'update', '-n', 'node.session.auth.password', '-v', '***'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m session {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m session" returned: 0 in 0.023s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[d38ba3a1-852f-4654-81dc-6a088e53d2d8]: (4, ('tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash)\ntcp: [4] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash)\ntcp: [6] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac (non-flash)\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] iscsiadm ('-m', 'session'): stdout=tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash) Jul 27 09:32:47 user nova-compute[70374]: tcp: [4] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash) Jul 27 09:32:47 user nova-compute[70374]: tcp: [6] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac (non-flash) Jul 27 09:32:47 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm_bare /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1182}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] iscsi session list stdout=tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash) Jul 27 09:32:47 user nova-compute[70374]: tcp: [4] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash) Jul 27 09:32:47 user nova-compute[70374]: tcp: [6] 172.16.0.220:3260,1 
iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac (non-flash) Jul 27 09:32:47 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsi_session /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1171}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e -p 172.16.0.220:3260 --login {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e -p 172.16.0.220:3260 --login" returned: 0 in 0.083s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[a863fbd0-9449-4ff2-92e6-40954153c5c7]: (4, ('Logging in to [iface: default, target: iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e, portal: 172.16.0.220,3260]\nLogin to [iface: default, target: iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e, portal: 172.16.0.220,3260] successful.\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] iscsiadm ('--login',): stdout=Logging in to [iface: default, target: iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e, portal: 172.16.0.220,3260] Jul 27 09:32:47 user nova-compute[70374]: Login to [iface: default, target: iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e, portal: 172.16.0.220,3260] successful. 
Jul 27 09:32:47 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e -p 172.16.0.220:3260 --op update -n node.startup -v automatic {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG nova.compute.manager [req-59e9bcab-3ac4-4b37-819e-8608fe60cff4 req-fe6c1b0b-205a-48c6-8055-0bd70b44e3b2 service nova] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Received event network-vif-plugged-e86f785e-63db-4bd0-92b9-6b813174c7a5 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-59e9bcab-3ac4-4b37-819e-8608fe60cff4 req-fe6c1b0b-205a-48c6-8055-0bd70b44e3b2 service nova] Acquiring lock "b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-59e9bcab-3ac4-4b37-819e-8608fe60cff4 req-fe6c1b0b-205a-48c6-8055-0bd70b44e3b2 service nova] Lock "b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-59e9bcab-3ac4-4b37-819e-8608fe60cff4 req-fe6c1b0b-205a-48c6-8055-0bd70b44e3b2 service nova] Lock "b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG nova.compute.manager [req-59e9bcab-3ac4-4b37-819e-8608fe60cff4 req-fe6c1b0b-205a-48c6-8055-0bd70b44e3b2 service nova] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] No waiting events found dispatching network-vif-plugged-e86f785e-63db-4bd0-92b9-6b813174c7a5 {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:32:47 user nova-compute[70374]: WARNING nova.compute.manager [req-59e9bcab-3ac4-4b37-819e-8608fe60cff4 req-fe6c1b0b-205a-48c6-8055-0bd70b44e3b2 service nova] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Received unexpected event network-vif-plugged-e86f785e-63db-4bd0-92b9-6b813174c7a5 for instance with vm_state active and task_state None. 
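Alongside the volume attach, the ovsdbapp transaction logged above at 09:32:47 plugs the tap device for port e86f785e-63db-4bd0-92b9-6b813174c7a5 into br-int: AddBridgeCommand is a no-op here, then AddPortCommand plus a DbSetCommand write iface-id, iface-status, attached-mac and vm-uuid into the Interface row's external_ids so the OVN driver can bind the port. os-vif drives this through ovsdbapp's Python IDL; the sketch below only approximates the same transaction with the ovs-vsctl CLI, and vsctl and plug_ovs_vif are illustrative names rather than os-vif API.

    import subprocess

    def vsctl(*args):
        """Thin wrapper over ovs-vsctl; the logged code talks to OVSDB via ovsdbapp instead."""
        subprocess.run(["ovs-vsctl", *args], check=True)

    def plug_ovs_vif(bridge, dev, iface_id, mac, vm_uuid):
        # Roughly AddBridgeCommand(name=bridge, may_exist=True, datapath_type=system).
        vsctl("--may-exist", "add-br", bridge, "--",
              "set", "Bridge", bridge, "datapath_type=system")
        # Roughly AddPortCommand plus DbSetCommand on the Interface row's external_ids.
        vsctl("--may-exist", "add-port", bridge, dev, "--",
              "set", "Interface", dev,
              f"external_ids:iface-id={iface_id}",
              "external_ids:iface-status=active",
              f"external_ids:attached-mac={mac}",
              f"external_ids:vm-uuid={vm_uuid}")

    # Values taken from the transaction logged above:
    plug_ovs_vif("br-int", "tape86f785e-63",
                 "e86f785e-63db-4bd0-92b9-6b813174c7a5",
                 "fa:16:3e:f8:a9:cb",
                 "b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06")
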
Jul 27 09:32:47 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e -p 172.16.0.220:3260 --op update -n node.startup -v automatic" returned: 0 in 0.027s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[3fc01f1f-40e6-442c-b15b-7bd800e0f0ae]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] iscsiadm ('--op', 'update', '-n', 'node.startup', '-v', 'automatic'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m session {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m session" returned: 0 in 0.023s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[2c6b8983-a647-4291-8d4f-f0adf7969a48]: (4, ('tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash)\ntcp: [4] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash)\ntcp: [6] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac (non-flash)\ntcp: [7] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e (non-flash)\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] iscsiadm ('-m', 'session'): stdout=tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash) Jul 27 09:32:47 user nova-compute[70374]: tcp: [4] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash) Jul 27 09:32:47 user nova-compute[70374]: tcp: [6] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac (non-flash) Jul 27 09:32:47 user nova-compute[70374]: tcp: [7] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e (non-flash) Jul 27 09:32:47 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm_bare /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1182}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] iscsi session list stdout=tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash) Jul 27 09:32:47 user nova-compute[70374]: tcp: [4] 172.16.0.220:3260,1 
iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash) Jul 27 09:32:47 user nova-compute[70374]: tcp: [6] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac (non-flash) Jul 27 09:32:47 user nova-compute[70374]: tcp: [7] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e (non-flash) Jul 27 09:32:47 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsi_session /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1171}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Lock "connect_to_iscsi_portal-172.16.0.220:3260-iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e" "released" by "os_brick.initiator.connectors.iscsi.ISCSIConnector._connect_to_iscsi_portal_unsafe" :: held 0.344s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:87}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Connected to 172.16.0.220:3260 {{(pid=70374) _connect_vol /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:633}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] HCTL ('36', '-', '-', 0) found on session 7 with lun 0 {{(pid=70374) get_hctl /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:694}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Scanning host 36 c: -, t: -, l: 0) {{(pid=70374) scan_iscsi /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:719}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG nova.network.neutron [req-27aed475-3713-4f3d-8447-1f7fbcf75e85 req-4da4ed27-c1e1-49e2-b178-7aeb99a8a2bf service nova] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Updated VIF entry in instance network info cache for port e86f785e-63db-4bd0-92b9-6b813174c7a5. 
{{(pid=70374) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3474}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG nova.network.neutron [req-27aed475-3713-4f3d-8447-1f7fbcf75e85 req-4da4ed27-c1e1-49e2-b178-7aeb99a8a2bf service nova] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Updating instance_info_cache with network_info: [{"id": "6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f", "address": "fa:16:3e:b2:9d:c3", "network": {"id": "158fe9d2-5b60-4b57-bdcb-10de0604f194", "bridge": "br-int", "label": "tempest-TaggedAttachmentsTest-1646104715-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "df3e52a41c1847b199e6dcd09b676fba", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6bd9ecfd-f7", "ovs_interfaceid": "6bd9ecfd-f75a-4f30-9be8-b4dd315cb41f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e86f785e-63db-4bd0-92b9-6b813174c7a5", "address": "fa:16:3e:f8:a9:cb", "network": {"id": "f4c65a93-da7c-42e7-9c5e-c721d1ecb59e", "bridge": "br-int", "label": "tempest-tagged-attachments-test-net-193407618", "subnets": [{"cidr": "10.10.10.0/24", "dns": [], "gateway": {"address": "10.10.10.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.10.80", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.10.10.1"}}], "meta": {"injected": false, "tenant_id": "df3e52a41c1847b199e6dcd09b676fba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape86f785e-63", "ovs_interfaceid": "e86f785e-63db-4bd0-92b9-6b813174c7a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): tee -a /sys/class/scsi_host/host36/scan {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-2527503c-2b26-4355-945d-94cba18792bd tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] No BDM found with device name vda, not building metadata. {{(pid=70374) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-2527503c-2b26-4355-945d-94cba18792bd tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] No BDM found with device name hda, not building metadata. 
{{(pid=70374) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-2527503c-2b26-4355-945d-94cba18792bd tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] No VIF found with MAC fa:16:3e:b2:9d:c3, not building metadata {{(pid=70374) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "tee -a /sys/class/scsi_host/host36/scan" returned: 0 in 0.019s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[4a0859e7-c7c6-4b9a-ab0d-14264e151c3c]: (4, ('- - 0', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Searching for a device in session 7 and hctl ['36', '*', '*', 0] yield: sde {{(pid=70374) device_name_by_hctl /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:713}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Connected to sde using {'target_iqn': 'iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e', 'target_portal': '172.16.0.220:3260', 'target_discovered': False, 'auth_method': 'CHAP', 'auth_username': 'NYfpjgzV', 'auth_password': '***', 'target_lun': 0, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False} {{(pid=70374) _connect_vol /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:661}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-27aed475-3713-4f3d-8447-1f7fbcf75e85 req-4da4ed27-c1e1-49e2-b178-7aeb99a8a2bf service nova] Releasing lock "refresh_cache-b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06" {{(pid=70374) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG nova.virt.libvirt.guest [None req-2527503c-2b26-4355-945d-94cba18792bd tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] set metadata xml: [instance metadata XML stripped by the log capture; the surviving text shows the instance name tempest-device-tagging-server-175106609, creation time 2023-07-27 09:32:47, memory 512, vcpus 1, and owner tempest-TaggedAttachmentsTest-493274998-project-member / tempest-TaggedAttachmentsTest-493274998]
{{(pid=70374) set_metadata /opt/stack/nova/nova/virt/libvirt/guest.py:359}} Jul 27 09:32:47 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-2527503c-2b26-4355-945d-94cba18792bd tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Lock "interface-b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06-None" "released" by "nova.compute.manager.ComputeManager.attach_interface..do_attach_interface" :: held 3.237s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] No BDM found with device name vda, not building metadata. {{(pid=70374) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] No BDM found with device name vdb, not building metadata.
{{(pid=70374) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] No VIF found with MAC fa:16:3e:2d:07:78, not building metadata {{(pid=70374) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-6d09fde6-af06-4e12-8545-49a102d0804e tempest-TestVolumeSwap-987419093 tempest-TestVolumeSwap-987419093-project-member] Lock "d35fe056-8279-479a-a673-6c61e5ec6933" "released" by "nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" :: held 9.970s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Acquiring lock "b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06" by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Lock "b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG nova.objects.instance [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Lazy-loading 'flavor' on Instance uuid b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-b1ba3c84-3fe0-421a-85d8-b1de42d96d9a tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Acquiring lock "773917c6-56d7-4491-a760-05f51593b7f0" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-b1ba3c84-3fe0-421a-85d8-b1de42d96d9a tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Lock "773917c6-56d7-4491-a760-05f51593b7f0" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-b1ba3c84-3fe0-421a-85d8-b1de42d96d9a tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Acquiring lock "773917c6-56d7-4491-a760-05f51593b7f0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:48 user 
nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-b1ba3c84-3fe0-421a-85d8-b1de42d96d9a tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Lock "773917c6-56d7-4491-a760-05f51593b7f0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-b1ba3c84-3fe0-421a-85d8-b1de42d96d9a tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Lock "773917c6-56d7-4491-a760-05f51593b7f0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:48 user nova-compute[70374]: INFO nova.compute.manager [None req-b1ba3c84-3fe0-421a-85d8-b1de42d96d9a tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Terminating instance Jul 27 09:32:48 user nova-compute[70374]: DEBUG nova.compute.manager [None req-b1ba3c84-3fe0-421a-85d8-b1de42d96d9a tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Start destroying the instance on the hypervisor. {{(pid=70374) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Lock "b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name..do_reserve" :: held 0.154s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Acquiring lock "b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06" by "nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Lock "b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06" acquired by 
"nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:48 user nova-compute[70374]: INFO nova.compute.manager [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Attaching volume 1bee565c-934c-49bb-9a23-cafd79caf6c9 to /dev/vdb Jul 27 09:32:48 user nova-compute[70374]: DEBUG os_brick.utils [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '10.0.0.210', 'multipath': False, 'enforce_multipath': True, 'host': 'user', 'execute': None}" {{(pid=70374) trace_logging_wrapper /usr/local/lib/python3.10/dist-packages/os_brick/utils.py:175}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.015s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[bb9c9e3b-80b1-4d8c-b901-498783156f0b]: (4, ('InitiatorName=iqn.2016-04.com.open-iscsi:ce3dd372cc44\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt / -n -o SOURCE {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "findmnt / -n -o SOURCE" returned: 0 in 0.028s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[53400302-4a49-428c-b3f6-ca5981863e57]: (4, ('/dev/mapper/vg0-lv--0\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): blkid /dev/mapper/vg0-lv--0 -s UUID -o value {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG nova.compute.manager [req-409e0c71-e0e8-4045-a339-756cb9f130a8 req-1e0f0b13-16f6-477f-9f49-0b1fa019438f service nova] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Received event network-vif-unplugged-c8fbf870-e2d7-4ec6-be17-98f3d9315f75 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-409e0c71-e0e8-4045-a339-756cb9f130a8 req-1e0f0b13-16f6-477f-9f49-0b1fa019438f service nova] Acquiring lock "773917c6-56d7-4491-a760-05f51593b7f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 
27 09:32:48 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-409e0c71-e0e8-4045-a339-756cb9f130a8 req-1e0f0b13-16f6-477f-9f49-0b1fa019438f service nova] Lock "773917c6-56d7-4491-a760-05f51593b7f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-409e0c71-e0e8-4045-a339-756cb9f130a8 req-1e0f0b13-16f6-477f-9f49-0b1fa019438f service nova] Lock "773917c6-56d7-4491-a760-05f51593b7f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG nova.compute.manager [req-409e0c71-e0e8-4045-a339-756cb9f130a8 req-1e0f0b13-16f6-477f-9f49-0b1fa019438f service nova] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] No waiting events found dispatching network-vif-unplugged-c8fbf870-e2d7-4ec6-be17-98f3d9315f75 {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG nova.compute.manager [req-409e0c71-e0e8-4045-a339-756cb9f130a8 req-1e0f0b13-16f6-477f-9f49-0b1fa019438f service nova] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Received event network-vif-unplugged-c8fbf870-e2d7-4ec6-be17-98f3d9315f75 for instance with task_state deleting. {{(pid=70374) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10810}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "blkid /dev/mapper/vg0-lv--0 -s UUID -o value" returned: 0 in 0.016s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[706630c9-2e1a-4928-b609-066b75b1291e]: (4, ('e95b3b51-542d-42ca-ac40-f83360608668\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Lock "connect_volume" "released" by "os_brick.initiator.connectors.iscsi.ISCSIConnector.connect_volume" :: held 1.402s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:87}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] <== connect_volume: return (1765ms) {'type': 'block', 'scsi_wwn': '26161313833653666', 'path': '/dev/sde'} {{(pid=70374) trace_logging_wrapper /usr/local/lib/python3.10/dist-packages/os_brick/utils.py:202}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG nova.virt.libvirt.volume.iscsi [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Attached iSCSI volume {'type': 'block', 'scsi_wwn': '26161313833653666', 'path': '/dev/sde'} {{(pid=70374) connect_volume /opt/stack/nova/nova/virt/libvirt/volume/iscsi.py:65}} Jul 27 09:32:48 user nova-compute[70374]: 
DEBUG oslo.privsep.daemon [-] privsep: reply[63033fe8-b7a7-4e7e-9a21-4b56ae0e3101]: (4, 'e20c3142-5af9-7467-ecd8-70b2e4a210d6') {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Running cmd (subprocess): nvme version {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] 'nvme version' failed. Not Retrying. {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:473}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.nvmeof [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] nvme not present on system {{(pid=70374) nvme_present /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/nvmeof.py:757}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): nvme show-hostnqn {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] 'nvme show-hostnqn' failed. Not Retrying. {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:473}} Jul 27 09:32:48 user nova-compute[70374]: WARNING os_brick.privileged.nvmeof [-] Could not generate host nqn: [Errno 2] No such file or directory: 'nvme' Jul 27 09:32:48 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[d1dde185-621c-4102-9136-b3100c62bb43]: (4, '') {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.lightos [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] LIGHTOS: [Errno 111] ECONNREFUSED {{(pid=70374) find_dsc /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/lightos.py:98}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.lightos [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] LIGHTOS: did not find dsc, continuing anyway. {{(pid=70374) get_connector_properties /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/lightos.py:76}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.lightos [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] LIGHTOS: no hostnqn found. 
{{(pid=70374) get_connector_properties /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/lightos.py:84}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG os_brick.utils [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] <== get_connector_properties: return (137ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '10.0.0.210', 'host': 'user', 'multipath': False, 'initiator': 'iqn.2016-04.com.open-iscsi:ce3dd372cc44', 'do_local_attach': False, 'uuid': 'e95b3b51-542d-42ca-ac40-f83360608668', 'system uuid': 'e20c3142-5af9-7467-ecd8-70b2e4a210d6', 'nvme_native_multipath': False} {{(pid=70374) trace_logging_wrapper /usr/local/lib/python3.10/dist-packages/os_brick/utils.py:202}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG nova.virt.block_device [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Updating existing volume attachment record: 99716295-d088-4bb8-bb69-cd7b54328f73 {{(pid=70374) _volume_attach /opt/stack/nova/nova/virt/block_device.py:631}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG nova.objects.instance [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Lazy-loading 'flavor' on Instance uuid 309c9c26-4a0f-45db-bb3a-595b19f3f627 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:32:48 user nova-compute[70374]: DEBUG nova.virt.libvirt.guest [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] attach device xml: [disk attach XML stripped by the log capture; the surviving text references volume 0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e]
{{(pid=70374) attach_device /opt/stack/nova/nova/virt/libvirt/guest.py:339}} Jul 27 09:32:49 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:49 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:49 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:49 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:49 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:49 user nova-compute[70374]: INFO nova.virt.libvirt.driver [-] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Instance destroyed successfully. Jul 27 09:32:49 user nova-compute[70374]: DEBUG nova.objects.instance [None req-b1ba3c84-3fe0-421a-85d8-b1de42d96d9a tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Lazy-loading 'resources' on Instance uuid 773917c6-56d7-4491-a760-05f51593b7f0 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:32:49 user nova-compute[70374]: DEBUG nova.virt.libvirt.vif [None req-b1ba3c84-3fe0-421a-85d8-b1de42d96d9a tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] vif_type=ovs
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='',created_at=2023-07-27T09:30:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-1574025589',display_name='tempest-VolumesActionsTest-instance-1574025589',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(6),hidden=False,host='user',hostname='tempest-volumesactionstest-instance-1574025589',id=10,image_ref='35458adf-261a-4e0b-a4db-b243619b2394',info_cache=InstanceInfoCache,instance_type_id=6,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-07-27T09:31:00Z,launched_on='user',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='d88292245b8d4bbfa07efc8084ae089c',ramdisk_id='',reservation_id='r-dbzwfifo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='35458adf-261a-4e0b-a4db-b243619b2394',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-VolumesActionsTest-2082395054',owner_user_name='tempest-VolumesActionsTest-2082395054-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-07-27T09:31:00Z,user_data=None,user_id='459cb24159604669a118a1da67fbdf72',uuid=773917c6-56d7-4491-a760-05f51593b7f0,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c8fbf870-e2d7-4ec6-be17-98f3d9315f75", "address": "fa:16:3e:19:0c:8d", "network": {"id": "efa0e322-091b-4b62-b5f9-8a09fa192696", "bridge": "br-int", "label": "tempest-VolumesActionsTest-681513154-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d88292245b8d4bbfa07efc8084ae089c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8fbf870-e2", "ovs_interfaceid": "c8fbf870-e2d7-4ec6-be17-98f3d9315f75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Jul 27 09:32:49 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-b1ba3c84-3fe0-421a-85d8-b1de42d96d9a tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Converting VIF {"id": "c8fbf870-e2d7-4ec6-be17-98f3d9315f75", "address": 
"fa:16:3e:19:0c:8d", "network": {"id": "efa0e322-091b-4b62-b5f9-8a09fa192696", "bridge": "br-int", "label": "tempest-VolumesActionsTest-681513154-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d88292245b8d4bbfa07efc8084ae089c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8fbf870-e2", "ovs_interfaceid": "c8fbf870-e2d7-4ec6-be17-98f3d9315f75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Jul 27 09:32:49 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-b1ba3c84-3fe0-421a-85d8-b1de42d96d9a tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:0c:8d,bridge_name='br-int',has_traffic_filtering=True,id=c8fbf870-e2d7-4ec6-be17-98f3d9315f75,network=Network(efa0e322-091b-4b62-b5f9-8a09fa192696),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8fbf870-e2') {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Jul 27 09:32:49 user nova-compute[70374]: DEBUG os_vif [None req-b1ba3c84-3fe0-421a-85d8-b1de42d96d9a tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:0c:8d,bridge_name='br-int',has_traffic_filtering=True,id=c8fbf870-e2d7-4ec6-be17-98f3d9315f75,network=Network(efa0e322-091b-4b62-b5f9-8a09fa192696),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8fbf870-e2') {{(pid=70374) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Jul 27 09:32:49 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:49 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc8fbf870-e2, bridge=br-int, if_exists=True) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:32:49 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:49 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Jul 27 09:32:49 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:49 user nova-compute[70374]: INFO os_vif [None req-b1ba3c84-3fe0-421a-85d8-b1de42d96d9a tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Successfully unplugged vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:19:0c:8d,bridge_name='br-int',has_traffic_filtering=True,id=c8fbf870-e2d7-4ec6-be17-98f3d9315f75,network=Network(efa0e322-091b-4b62-b5f9-8a09fa192696),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8fbf870-e2') Jul 27 09:32:49 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-b1ba3c84-3fe0-421a-85d8-b1de42d96d9a tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Deleting instance files /opt/stack/data/nova/instances/773917c6-56d7-4491-a760-05f51593b7f0_del Jul 27 09:32:49 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-b1ba3c84-3fe0-421a-85d8-b1de42d96d9a tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Deletion of /opt/stack/data/nova/instances/773917c6-56d7-4491-a760-05f51593b7f0_del complete Jul 27 09:32:49 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] No BDM found with device name sda, not building metadata. {{(pid=70374) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Jul 27 09:32:49 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] No BDM found with device name sdb, not building metadata. {{(pid=70374) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Jul 27 09:32:49 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] No BDM found with device name sdc, not building metadata. {{(pid=70374) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Jul 27 09:32:49 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] No VIF found with MAC fa:16:3e:57:6c:c6, not building metadata {{(pid=70374) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Jul 27 09:32:49 user nova-compute[70374]: INFO nova.compute.manager [None req-b1ba3c84-3fe0-421a-85d8-b1de42d96d9a tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Took 1.32 seconds to destroy the instance on the hypervisor. Jul 27 09:32:49 user nova-compute[70374]: DEBUG oslo.service.loopingcall [None req-b1ba3c84-3fe0-421a-85d8-b1de42d96d9a tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=70374) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Jul 27 09:32:49 user nova-compute[70374]: DEBUG nova.compute.manager [-] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Deallocating network for instance {{(pid=70374) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} Jul 27 09:32:49 user nova-compute[70374]: DEBUG nova.network.neutron [-] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] deallocate_for_instance() {{(pid=70374) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} Jul 27 09:32:49 user nova-compute[70374]: DEBUG nova.compute.manager [req-8dd09c00-738c-4b38-af18-5fa8f0a793a0 req-1251ce73-c4a5-4ac4-9670-385412adb42a service nova] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Received event network-vif-plugged-e86f785e-63db-4bd0-92b9-6b813174c7a5 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:32:49 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-8dd09c00-738c-4b38-af18-5fa8f0a793a0 req-1251ce73-c4a5-4ac4-9670-385412adb42a service nova] Acquiring lock "b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:49 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-8dd09c00-738c-4b38-af18-5fa8f0a793a0 req-1251ce73-c4a5-4ac4-9670-385412adb42a service nova] Lock "b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:49 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-8dd09c00-738c-4b38-af18-5fa8f0a793a0 req-1251ce73-c4a5-4ac4-9670-385412adb42a service nova] Lock "b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:49 user nova-compute[70374]: DEBUG nova.compute.manager [req-8dd09c00-738c-4b38-af18-5fa8f0a793a0 req-1251ce73-c4a5-4ac4-9670-385412adb42a service nova] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] No waiting events found dispatching network-vif-plugged-e86f785e-63db-4bd0-92b9-6b813174c7a5 {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:32:49 user nova-compute[70374]: WARNING nova.compute.manager [req-8dd09c00-738c-4b38-af18-5fa8f0a793a0 req-1251ce73-c4a5-4ac4-9670-385412adb42a service nova] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Received unexpected event network-vif-plugged-e86f785e-63db-4bd0-92b9-6b813174c7a5 for instance with vm_state active and task_state None. 
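Throughout this section the lockutils records follow one shape: Acquiring lock "X" by "f", acquired :: waited N s, "released" :: held N s. A rough sketch of a decorator that produces that acquire/release accounting (an illustration only, not oslo.concurrency's implementation):

    import functools
    import threading
    import time

    _locks = {}                                # lock name -> threading.Lock
    _registry_lock = threading.Lock()

    def synchronized(name):
        """Wrap a function so the wait and hold times of its named lock are logged."""
        def decorator(func):
            @functools.wraps(func)
            def inner(*args, **kwargs):
                with _registry_lock:
                    lock = _locks.setdefault(name, threading.Lock())
                print('Acquiring lock "%s" by "%s"' % (name, func.__name__))
                t0 = time.monotonic()
                with lock:
                    waited = time.monotonic() - t0
                    print('Lock "%s" acquired by "%s" :: waited %.3fs'
                          % (name, func.__name__, waited))
                    t1 = time.monotonic()
                    try:
                        return func(*args, **kwargs)
                    finally:
                        held = time.monotonic() - t1
                        print('Lock "%s" "released" by "%s" :: held %.3fs'
                              % (name, func.__name__, held))
            return inner
        return decorator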
Jul 27 09:32:49 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:49 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-db80aa10-c0ae-4202-99c3-a07c936df96c tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Lock "309c9c26-4a0f-45db-bb3a-595b19f3f627" "released" by "nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" :: held 10.518s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:49 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:49 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:49 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:49 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:49 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] Acquiring lock "d35fe056-8279-479a-a673-6c61e5ec6933" by "nova.compute.manager.ComputeManager.swap_volume.._do_locked_swap_volume" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] Lock "d35fe056-8279-479a-a673-6c61e5ec6933" acquired by "nova.compute.manager.ComputeManager.swap_volume.._do_locked_swap_volume" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG os_brick.utils [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '10.0.0.210', 'multipath': False, 'enforce_multipath': True, 'host': 'user', 'execute': None}" {{(pid=70374) trace_logging_wrapper /usr/local/lib/python3.10/dist-packages/os_brick/utils.py:175}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.015s {{(pid=82164) 
execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[adec6ea5-61d6-48f7-bbbc-e061d8d285f2]: (4, ('InitiatorName=iqn.2016-04.com.open-iscsi:ce3dd372cc44\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt / -n -o SOURCE {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "findmnt / -n -o SOURCE" returned: 0 in 0.019s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[cc77c665-0422-4f62-9c26-c94b2d68167d]: (4, ('/dev/mapper/vg0-lv--0\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): blkid /dev/mapper/vg0-lv--0 -s UUID -o value {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "blkid /dev/mapper/vg0-lv--0 -s UUID -o value" returned: 0 in 0.020s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[a9b9834f-a4bf-43ff-83db-a21b9bd5dfd0]: (4, ('e95b3b51-542d-42ca-ac40-f83360608668\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[7cd1b524-cb22-4902-9fff-36f6f5b7bc3c]: (4, 'e20c3142-5af9-7467-ecd8-70b2e4a210d6') {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] Running cmd (subprocess): nvme version {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] 'nvme version' failed. Not Retrying. {{(pid=70374) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:473}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.nvmeof [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] nvme not present on system {{(pid=70374) nvme_present /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/nvmeof.py:757}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): nvme show-hostnqn {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] 'nvme show-hostnqn' failed. Not Retrying. 
{{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:473}} Jul 27 09:32:50 user nova-compute[70374]: WARNING os_brick.privileged.nvmeof [-] Could not generate host nqn: [Errno 2] No such file or directory: 'nvme' Jul 27 09:32:50 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[a587e9dc-3181-4eb0-bfc2-8f2f4369e23e]: (4, '') {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.lightos [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] LIGHTOS: [Errno 111] ECONNREFUSED {{(pid=70374) find_dsc /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/lightos.py:98}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.lightos [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] LIGHTOS: did not find dsc, continuing anyway. {{(pid=70374) get_connector_properties /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/lightos.py:76}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.lightos [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] LIGHTOS: no hostnqn found. {{(pid=70374) get_connector_properties /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/lightos.py:84}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG os_brick.utils [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] <== get_connector_properties: return (141ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '10.0.0.210', 'host': 'user', 'multipath': False, 'initiator': 'iqn.2016-04.com.open-iscsi:ce3dd372cc44', 'do_local_attach': False, 'uuid': 'e95b3b51-542d-42ca-ac40-f83360608668', 'system uuid': 'e20c3142-5af9-7467-ecd8-70b2e4a210d6', 'nvme_native_multipath': False} {{(pid=70374) trace_logging_wrapper /usr/local/lib/python3.10/dist-packages/os_brick/utils.py:202}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:50 user nova-compute[70374]: INFO nova.compute.manager [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Swapping volume 25fa94b4-c6dc-4dbd-add9-e3a58523baac for 4f6a5d8f-095f-487e-b752-c9b1ce301b09 Jul 27 09:32:50 user nova-compute[70374]: DEBUG nova.network.neutron [-] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Updating instance_info_cache with network_info: [] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:32:50 user nova-compute[70374]: INFO nova.compute.manager [-] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Took 0.67 seconds to deallocate network for instance. 
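The get_connector_properties trace above assembles the connector dict from a handful of host probes: the iSCSI initiator name from /etc/iscsi/initiatorname.iscsi, the root filesystem UUID via findmnt and blkid, and an nvme-cli check that fails harmlessly because the tool is absent. A condensed sketch of those probes (illustrative; os-brick runs them through privsep and rootwrap, not directly like this):

    import platform
    import subprocess

    def _run(cmd):
        """Run a probe command, returning stdout or None if it is unavailable."""
        try:
            return subprocess.run(cmd, capture_output=True, text=True,
                                  check=True).stdout.strip()
        except (OSError, subprocess.CalledProcessError):
            return None

    def get_connector_properties(my_ip, host):
        initiator = _run(["cat", "/etc/iscsi/initiatorname.iscsi"])
        if initiator and "=" in initiator:
            initiator = initiator.split("=", 1)[1]   # "InitiatorName=iqn...." -> "iqn...."
        root_dev = _run(["findmnt", "/", "-n", "-o", "SOURCE"])
        root_uuid = _run(["blkid", root_dev, "-s", "UUID", "-o", "value"]) if root_dev else None
        props = {
            "platform": platform.machine(),          # 'x86_64' on this host
            "os_type": "linux",
            "ip": my_ip,
            "host": host,
            "multipath": False,
            "initiator": initiator,
            "do_local_attach": False,
            "uuid": root_uuid,
            "nvme_native_multipath": False,
        }
        if _run(["nvme", "version"]) is not None:
            # only gathered when nvme-cli exists; key name is hypothetical here.
            # In the log above this branch is skipped: "'nvme version' failed. Not Retrying."
            props["host_nqn"] = _run(["nvme", "show-hostnqn"])
        return props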
Jul 27 09:32:50 user nova-compute[70374]: DEBUG nova.compute.manager [req-01e60465-c843-412b-ad29-30e41157447a req-32aba15b-3105-4746-a4f5-dc0fcc557c17 service nova] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Received event network-vif-deleted-c8fbf870-e2d7-4ec6-be17-98f3d9315f75 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-b1ba3c84-3fe0-421a-85d8-b1de42d96d9a tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-b1ba3c84-3fe0-421a-85d8-b1de42d96d9a tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG nova.compute.provider_tree [None req-b1ba3c84-3fe0-421a-85d8-b1de42d96d9a tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Inventory has not changed in ProviderTree for provider: f7548644-4a09-4ad8-9aa6-6e05d85a9f5b {{(pid=70374) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG nova.scheduler.client.report [None req-b1ba3c84-3fe0-421a-85d8-b1de42d96d9a tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Inventory has not changed for provider f7548644-4a09-4ad8-9aa6-6e05d85a9f5b based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16011, 'reserved': 512, 'min_unit': 1, 'max_unit': 16011, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70374) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG nova.compute.manager [req-ddee4fb7-55a7-4609-968a-be73d8243058 req-1c6802d1-dc83-4faf-8e74-90d5b71fe4f3 service nova] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Received event network-vif-plugged-c8fbf870-e2d7-4ec6-be17-98f3d9315f75 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-ddee4fb7-55a7-4609-968a-be73d8243058 req-1c6802d1-dc83-4faf-8e74-90d5b71fe4f3 service nova] Acquiring lock "773917c6-56d7-4491-a760-05f51593b7f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-ddee4fb7-55a7-4609-968a-be73d8243058 req-1c6802d1-dc83-4faf-8e74-90d5b71fe4f3 service nova] Lock "773917c6-56d7-4491-a760-05f51593b7f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 
09:32:50 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-ddee4fb7-55a7-4609-968a-be73d8243058 req-1c6802d1-dc83-4faf-8e74-90d5b71fe4f3 service nova] Lock "773917c6-56d7-4491-a760-05f51593b7f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG nova.compute.manager [req-ddee4fb7-55a7-4609-968a-be73d8243058 req-1c6802d1-dc83-4faf-8e74-90d5b71fe4f3 service nova] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] No waiting events found dispatching network-vif-plugged-c8fbf870-e2d7-4ec6-be17-98f3d9315f75 {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:32:50 user nova-compute[70374]: WARNING nova.compute.manager [req-ddee4fb7-55a7-4609-968a-be73d8243058 req-1c6802d1-dc83-4faf-8e74-90d5b71fe4f3 service nova] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Received unexpected event network-vif-plugged-c8fbf870-e2d7-4ec6-be17-98f3d9315f75 for instance with vm_state deleted and task_state None. Jul 27 09:32:50 user nova-compute[70374]: DEBUG nova.compute.manager [req-ddee4fb7-55a7-4609-968a-be73d8243058 req-1c6802d1-dc83-4faf-8e74-90d5b71fe4f3 service nova] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Received event network-vif-plugged-c8fbf870-e2d7-4ec6-be17-98f3d9315f75 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-ddee4fb7-55a7-4609-968a-be73d8243058 req-1c6802d1-dc83-4faf-8e74-90d5b71fe4f3 service nova] Acquiring lock "773917c6-56d7-4491-a760-05f51593b7f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-ddee4fb7-55a7-4609-968a-be73d8243058 req-1c6802d1-dc83-4faf-8e74-90d5b71fe4f3 service nova] Lock "773917c6-56d7-4491-a760-05f51593b7f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-ddee4fb7-55a7-4609-968a-be73d8243058 req-1c6802d1-dc83-4faf-8e74-90d5b71fe4f3 service nova] Lock "773917c6-56d7-4491-a760-05f51593b7f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG nova.compute.manager [req-ddee4fb7-55a7-4609-968a-be73d8243058 req-1c6802d1-dc83-4faf-8e74-90d5b71fe4f3 service nova] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] No waiting events found dispatching network-vif-plugged-c8fbf870-e2d7-4ec6-be17-98f3d9315f75 {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:32:50 user nova-compute[70374]: WARNING nova.compute.manager [req-ddee4fb7-55a7-4609-968a-be73d8243058 req-1c6802d1-dc83-4faf-8e74-90d5b71fe4f3 service nova] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Received unexpected event network-vif-plugged-c8fbf870-e2d7-4ec6-be17-98f3d9315f75 for instance with vm_state deleted and task_state None. 
Jul 27 09:32:50 user nova-compute[70374]: DEBUG nova.compute.manager [req-ddee4fb7-55a7-4609-968a-be73d8243058 req-1c6802d1-dc83-4faf-8e74-90d5b71fe4f3 service nova] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Received event network-vif-plugged-c8fbf870-e2d7-4ec6-be17-98f3d9315f75 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-ddee4fb7-55a7-4609-968a-be73d8243058 req-1c6802d1-dc83-4faf-8e74-90d5b71fe4f3 service nova] Acquiring lock "773917c6-56d7-4491-a760-05f51593b7f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-ddee4fb7-55a7-4609-968a-be73d8243058 req-1c6802d1-dc83-4faf-8e74-90d5b71fe4f3 service nova] Lock "773917c6-56d7-4491-a760-05f51593b7f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-ddee4fb7-55a7-4609-968a-be73d8243058 req-1c6802d1-dc83-4faf-8e74-90d5b71fe4f3 service nova] Lock "773917c6-56d7-4491-a760-05f51593b7f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG nova.compute.manager [req-ddee4fb7-55a7-4609-968a-be73d8243058 req-1c6802d1-dc83-4faf-8e74-90d5b71fe4f3 service nova] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] No waiting events found dispatching network-vif-plugged-c8fbf870-e2d7-4ec6-be17-98f3d9315f75 {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:32:50 user nova-compute[70374]: WARNING nova.compute.manager [req-ddee4fb7-55a7-4609-968a-be73d8243058 req-1c6802d1-dc83-4faf-8e74-90d5b71fe4f3 service nova] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Received unexpected event network-vif-plugged-c8fbf870-e2d7-4ec6-be17-98f3d9315f75 for instance with vm_state deleted and task_state None. 
Jul 27 09:32:50 user nova-compute[70374]: DEBUG nova.compute.manager [req-ddee4fb7-55a7-4609-968a-be73d8243058 req-1c6802d1-dc83-4faf-8e74-90d5b71fe4f3 service nova] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Received event network-vif-unplugged-c8fbf870-e2d7-4ec6-be17-98f3d9315f75 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-ddee4fb7-55a7-4609-968a-be73d8243058 req-1c6802d1-dc83-4faf-8e74-90d5b71fe4f3 service nova] Acquiring lock "773917c6-56d7-4491-a760-05f51593b7f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-ddee4fb7-55a7-4609-968a-be73d8243058 req-1c6802d1-dc83-4faf-8e74-90d5b71fe4f3 service nova] Lock "773917c6-56d7-4491-a760-05f51593b7f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-ddee4fb7-55a7-4609-968a-be73d8243058 req-1c6802d1-dc83-4faf-8e74-90d5b71fe4f3 service nova] Lock "773917c6-56d7-4491-a760-05f51593b7f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG nova.compute.manager [req-ddee4fb7-55a7-4609-968a-be73d8243058 req-1c6802d1-dc83-4faf-8e74-90d5b71fe4f3 service nova] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] No waiting events found dispatching network-vif-unplugged-c8fbf870-e2d7-4ec6-be17-98f3d9315f75 {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:32:50 user nova-compute[70374]: WARNING nova.compute.manager [req-ddee4fb7-55a7-4609-968a-be73d8243058 req-1c6802d1-dc83-4faf-8e74-90d5b71fe4f3 service nova] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Received unexpected event network-vif-unplugged-c8fbf870-e2d7-4ec6-be17-98f3d9315f75 for instance with vm_state deleted and task_state None. 
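The run of identical notifications continues below; each one follows the same path inside the compute manager: the external-event handler tries to pop a registered waiter under the instance's "<uuid>-events" lock and, finding none for an instance whose vm_state is already deleted, logs the event as unexpected. The sketch below only illustrates that pop-and-warn pattern; it is not nova's actual implementation, and the class and function bodies are assumptions modeled on the lock and log messages in the records above.

```python
import logging
import threading

LOG = logging.getLogger("nova.compute.manager")


class InstanceEvents:
    """Illustrative stand-in for nova's per-instance event waiter registry."""

    def __init__(self):
        self._lock = threading.Lock()   # plays the role of the "<uuid>-events" lock
        self._waiters = {}              # {instance_uuid: {event_name: threading.Event}}

    def pop_instance_event(self, instance_uuid, event_name):
        # Mirrors the "Acquiring lock ... by _pop_event" / "released" pairs above.
        with self._lock:
            return self._waiters.get(instance_uuid, {}).pop(event_name, None)


def external_instance_event(events, instance_uuid, vm_state, task_state, event_name):
    # One incoming notification, e.g. "network-vif-plugged-<port uuid>".
    LOG.debug("[instance: %s] Received event %s", instance_uuid, event_name)
    waiter = events.pop_instance_event(instance_uuid, event_name)
    if waiter is None:
        LOG.debug("[instance: %s] No waiting events found dispatching %s",
                  instance_uuid, event_name)
        LOG.warning("[instance: %s] Received unexpected event %s for instance "
                    "with vm_state %s and task_state %s.",
                    instance_uuid, event_name, vm_state, task_state)
        return
    waiter.set()  # wake the task that registered the waiter
```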
Jul 27 09:32:50 user nova-compute[70374]: DEBUG nova.compute.manager [req-ddee4fb7-55a7-4609-968a-be73d8243058 req-1c6802d1-dc83-4faf-8e74-90d5b71fe4f3 service nova] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Received event network-vif-plugged-c8fbf870-e2d7-4ec6-be17-98f3d9315f75 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-ddee4fb7-55a7-4609-968a-be73d8243058 req-1c6802d1-dc83-4faf-8e74-90d5b71fe4f3 service nova] Acquiring lock "773917c6-56d7-4491-a760-05f51593b7f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-ddee4fb7-55a7-4609-968a-be73d8243058 req-1c6802d1-dc83-4faf-8e74-90d5b71fe4f3 service nova] Lock "773917c6-56d7-4491-a760-05f51593b7f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-ddee4fb7-55a7-4609-968a-be73d8243058 req-1c6802d1-dc83-4faf-8e74-90d5b71fe4f3 service nova] Lock "773917c6-56d7-4491-a760-05f51593b7f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG nova.compute.manager [req-ddee4fb7-55a7-4609-968a-be73d8243058 req-1c6802d1-dc83-4faf-8e74-90d5b71fe4f3 service nova] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] No waiting events found dispatching network-vif-plugged-c8fbf870-e2d7-4ec6-be17-98f3d9315f75 {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:32:50 user nova-compute[70374]: WARNING nova.compute.manager [req-ddee4fb7-55a7-4609-968a-be73d8243058 req-1c6802d1-dc83-4faf-8e74-90d5b71fe4f3 service nova] [instance: 773917c6-56d7-4491-a760-05f51593b7f0] Received unexpected event network-vif-plugged-c8fbf870-e2d7-4ec6-be17-98f3d9315f75 for instance with vm_state deleted and task_state None. 
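That is the last repeat of the vif events for the already-deleted instance in this stretch of the journal; the service then moves on to the volume detach for instance 309c9c26-4a0f-45db-bb3a-595b19f3f627. When triaging a capture like this, it can help to tally the unexpected-event warnings per instance and port. A minimal sketch, assuming the journal has been exported to a plain-text file with one record per line; the nova-compute.log path is only an illustration.

```python
import re
from collections import Counter

# Hypothetical path to the exported journal text; adjust to your capture.
LOG_PATH = "nova-compute.log"

PATTERN = re.compile(
    r"WARNING nova\.compute\.manager .*?\[instance: (?P<instance>[0-9a-f-]{36})\] "
    r"Received unexpected event (?P<event>network-vif-(?:un)?plugged-[0-9a-f-]{36}) "
    r"for instance with vm_state (?P<vm_state>\S+)"
)

counts = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as fh:
    for line in fh:
        match = PATTERN.search(line)
        if match:
            counts[(match["instance"], match["event"], match["vm_state"])] += 1

for (instance, event, vm_state), n in counts.most_common():
    print(f"{n:4d}  {instance}  {event}  (vm_state={vm_state})")
```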
Jul 27 09:32:50 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-f0d0dbe6-77e0-4072-b67b-27bb8a0bcf99 tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Acquiring lock "309c9c26-4a0f-45db-bb3a-595b19f3f627" by "nova.compute.manager.ComputeManager.detach_volume..do_detach_volume" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-f0d0dbe6-77e0-4072-b67b-27bb8a0bcf99 tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Lock "309c9c26-4a0f-45db-bb3a-595b19f3f627" acquired by "nova.compute.manager.ComputeManager.detach_volume..do_detach_volume" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:50 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-b1ba3c84-3fe0-421a-85d8-b1de42d96d9a tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.491s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:50 user nova-compute[70374]: INFO nova.compute.manager [None req-f0d0dbe6-77e0-4072-b67b-27bb8a0bcf99 tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Detaching volume 0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e Jul 27 09:32:51 user nova-compute[70374]: INFO nova.scheduler.client.report [None req-b1ba3c84-3fe0-421a-85d8-b1de42d96d9a tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Deleted allocations for instance 773917c6-56d7-4491-a760-05f51593b7f0 Jul 27 09:32:51 user nova-compute[70374]: INFO nova.virt.block_device [None req-f0d0dbe6-77e0-4072-b67b-27bb8a0bcf99 tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Attempting to driver detach volume 0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e from mountpoint /dev/sdc Jul 27 09:32:51 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-f0d0dbe6-77e0-4072-b67b-27bb8a0bcf99 tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Attempting to detach device sdc from instance 309c9c26-4a0f-45db-bb3a-595b19f3f627 from the persistent domain config. {{(pid=70374) _detach_from_persistent /opt/stack/nova/nova/virt/libvirt/driver.py:2477}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG nova.virt.libvirt.guest [None req-f0d0dbe6-77e0-4072-b67b-27bb8a0bcf99 tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] detach device xml: [libvirt <disk> device XML elided from this capture (markup stripped); volume serial 0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e]
Jul 27 09:32:51 user nova-compute[70374]: {{(pid=70374) detach_device /opt/stack/nova/nova/virt/libvirt/guest.py:465}} Jul 27 09:32:51 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-f0d0dbe6-77e0-4072-b67b-27bb8a0bcf99 tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Successfully detached device sdc from instance 309c9c26-4a0f-45db-bb3a-595b19f3f627 from the persistent domain config. Jul 27 09:32:51 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-f0d0dbe6-77e0-4072-b67b-27bb8a0bcf99 tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] (1/8): Attempting to detach device sdc with device alias scsi0-0-0-2 from instance 309c9c26-4a0f-45db-bb3a-595b19f3f627 from the live domain config. {{(pid=70374) _detach_from_live_with_retry /opt/stack/nova/nova/virt/libvirt/driver.py:2513}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG nova.virt.libvirt.guest [None req-f0d0dbe6-77e0-4072-b67b-27bb8a0bcf99 tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] detach device xml: [same libvirt <disk> device XML, again elided from this capture; volume serial 0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e]
Jul 27 09:32:51 user nova-compute[70374]: {{(pid=70374) detach_device /opt/stack/nova/nova/virt/libvirt/guest.py:465}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-b1ba3c84-3fe0-421a-85d8-b1de42d96d9a tempest-VolumesActionsTest-2082395054 tempest-VolumesActionsTest-2082395054-project-member] Lock "773917c6-56d7-4491-a760-05f51593b7f0" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.714s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Received event [libvirt device-removed event for alias scsi0-0-0-2; event repr elided from this capture] from libvirt while the driver is waiting for it; dispatched. {{(pid=70374) emit_event /opt/stack/nova/nova/virt/libvirt/driver.py:2360}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-f0d0dbe6-77e0-4072-b67b-27bb8a0bcf99 tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Start waiting for the detach event from libvirt for device sdc with device alias scsi0-0-0-2 for instance 309c9c26-4a0f-45db-bb3a-595b19f3f627 {{(pid=70374) _detach_from_live_and_wait_for_event /opt/stack/nova/nova/virt/libvirt/driver.py:2589}} Jul 27 09:32:51 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-f0d0dbe6-77e0-4072-b67b-27bb8a0bcf99 tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Successfully detached device sdc from instance 309c9c26-4a0f-45db-bb3a-595b19f3f627 from the live domain config. Jul 27 09:32:51 user nova-compute[70374]: DEBUG nova.virt.libvirt.volume.iscsi [None req-f0d0dbe6-77e0-4072-b67b-27bb8a0bcf99 tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] calling os-brick to detach iSCSI Volume {{(pid=70374) disconnect_volume /opt/stack/nova/nova/virt/libvirt/volume/iscsi.py:72}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-f0d0dbe6-77e0-4072-b67b-27bb8a0bcf99 tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] ==> disconnect_volume: call "{'args': ([ISCSIConnector object repr elided from this capture], {'target_iqn': 'iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e', 'target_portal': '172.16.0.220:3260', 'target_discovered': False, 'auth_method': 'CHAP', 'auth_username': 'NYfpjgzV', 'auth_password': '***', 'target_lun': 0, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False, 'device_path': '/dev/sde'}, None), 'kwargs': False}}" {{(pid=70374) trace_logging_wrapper /usr/local/lib/python3.10/dist-packages/os_brick/utils.py:175}} Jul 27 09:32:51 user nova-compute[70374]: WARNING os_brick.initiator.connectors.base [None req-f0d0dbe6-77e0-4072-b67b-27bb8a0bcf99 tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Service needs to call os_brick.setup() before connecting volumes, if it doesn't it will break on the next release Jul 27 09:32:51 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-f0d0dbe6-77e0-4072-b67b-27bb8a0bcf99 tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Acquiring lock "connect_volume" by
"os_brick.initiator.connectors.iscsi.ISCSIConnector.disconnect_volume" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:68}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-f0d0dbe6-77e0-4072-b67b-27bb8a0bcf99 tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Lock "connect_volume" acquired by "os_brick.initiator.connectors.iscsi.ISCSIConnector.disconnect_volume" :: waited 0.002s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:73}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m discoverydb -o show -P 1 {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m discoverydb -o show -P 1" returned: 0 in 0.019s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[f8ce92dd-1820-483d-bc59-cbef9bb070ee]: (4, ('SENDTARGETS:\nNo targets found.\niSNS:\nNo targets found.\nSTATIC:\nTarget: iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291\n\tPortal: 172.16.0.220:3260,-1\n\t\tIface Name: default\nTarget: iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e\n\tPortal: 172.16.0.220:3260,-1\n\t\tIface Name: default\nTarget: iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251\n\tPortal: 172.16.0.220:3260,-1\n\t\tIface Name: default\nTarget: iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac\n\tPortal: 172.16.0.220:3260,-1\n\t\tIface Name: default\nFIRMWARE:\nNo targets found.\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-f0d0dbe6-77e0-4072-b67b-27bb8a0bcf99 tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] iscsiadm ['-m', 'discoverydb', '-o', 'show', '-P', 1]: stdout=SENDTARGETS: Jul 27 09:32:51 user nova-compute[70374]: No targets found. Jul 27 09:32:51 user nova-compute[70374]: iSNS: Jul 27 09:32:51 user nova-compute[70374]: No targets found. 
Jul 27 09:32:51 user nova-compute[70374]: STATIC: Jul 27 09:32:51 user nova-compute[70374]: Target: iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 Jul 27 09:32:51 user nova-compute[70374]: Portal: 172.16.0.220:3260,-1 Jul 27 09:32:51 user nova-compute[70374]: Iface Name: default Jul 27 09:32:51 user nova-compute[70374]: Target: iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e Jul 27 09:32:51 user nova-compute[70374]: Portal: 172.16.0.220:3260,-1 Jul 27 09:32:51 user nova-compute[70374]: Iface Name: default Jul 27 09:32:51 user nova-compute[70374]: Target: iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 Jul 27 09:32:51 user nova-compute[70374]: Portal: 172.16.0.220:3260,-1 Jul 27 09:32:51 user nova-compute[70374]: Iface Name: default Jul 27 09:32:51 user nova-compute[70374]: Target: iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac Jul 27 09:32:51 user nova-compute[70374]: Portal: 172.16.0.220:3260,-1 Jul 27 09:32:51 user nova-compute[70374]: Iface Name: default Jul 27 09:32:51 user nova-compute[70374]: FIRMWARE: Jul 27 09:32:51 user nova-compute[70374]: No targets found. Jul 27 09:32:51 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm_bare /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1182}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-f0d0dbe6-77e0-4072-b67b-27bb8a0bcf99 tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Regex to get portals from discoverydb: ^SENDTARGETS: Jul 27 09:32:51 user nova-compute[70374]: .*?^DiscoveryAddress: 172.16.0.220,3260.*? Jul 27 09:32:51 user nova-compute[70374]: (.*?)^(?:DiscoveryAddress|iSNS):.* {{(pid=70374) _get_discoverydb_portals /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:371}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-f0d0dbe6-77e0-4072-b67b-27bb8a0bcf99 tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Getting connected devices for (ips,iqns,luns)=[('172.16.0.220:3260', 'iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e', 0)] {{(pid=70374) _get_connection_devices /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:819}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node" returned: 0 in 0.017s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[42bf901d-13cf-497a-975b-f2ff13ecd5c2]: (4, ('172.16.0.220:3260,4294967295 iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e\n172.16.0.220:3260,4294967295 iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac\n172.16.0.220:3260,4294967295 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291\n172.16.0.220:3260,4294967295 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): 
iscsiadm -m session {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m session" returned: 0 in 0.026s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[70277205-62f6-43fb-b0f3-a2bb07d6fdd9]: (4, ('tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash)\ntcp: [4] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash)\ntcp: [6] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac (non-flash)\ntcp: [7] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e (non-flash)\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-f0d0dbe6-77e0-4072-b67b-27bb8a0bcf99 tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] iscsiadm ('-m', 'session'): stdout=tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash) Jul 27 09:32:51 user nova-compute[70374]: tcp: [4] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash) Jul 27 09:32:51 user nova-compute[70374]: tcp: [6] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac (non-flash) Jul 27 09:32:51 user nova-compute[70374]: tcp: [7] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e (non-flash) Jul 27 09:32:51 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm_bare /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1182}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-f0d0dbe6-77e0-4072-b67b-27bb8a0bcf99 tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] iscsi session list stdout=tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash) Jul 27 09:32:51 user nova-compute[70374]: tcp: [4] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash) Jul 27 09:32:51 user nova-compute[70374]: tcp: [6] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac (non-flash) Jul 27 09:32:51 user nova-compute[70374]: tcp: [7] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e (non-flash) Jul 27 09:32:51 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsi_session /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1171}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-f0d0dbe6-77e0-4072-b67b-27bb8a0bcf99 tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Resulting device map defaultdict(. 
at 0x7f280b7c4d30>, {('172.16.0.220:3260', 'iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e'): ({'sde'}, set())}) {{(pid=70374) _get_connection_devices /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:852}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-f0d0dbe6-77e0-4072-b67b-27bb8a0bcf99 tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Removing single pathed devices sde {{(pid=70374) remove_connection /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:309}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.020s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[10b9b011-c93b-4465-b5e5-51e4a17fe24a]: (4, ('path checker states:\nup 4\n\npaths: 0\nbusy: False\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd del path /dev/sde {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "multipathd del path /dev/sde" returned: 0 in 0.018s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[bf3849a7-e892-460d-80c5-7ce3b407266a]: (4, ('ok\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-f0d0dbe6-77e0-4072-b67b-27bb8a0bcf99 tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Flushing IO for device /dev/sde {{(pid=70374) flush_device_io /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:369}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): blockdev --flushbufs /dev/sde {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "blockdev --flushbufs /dev/sde" returned: 0 in 0.012s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[0d4160c1-0948-4b1e-bc26-9aa6a56ab6aa]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-f0d0dbe6-77e0-4072-b67b-27bb8a0bcf99 tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Remove SCSI device /dev/sde with /sys/block/sde/device/delete {{(pid=70374) remove_scsi_device 
/usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:83}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): tee -a /sys/block/sde/device/delete {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "tee -a /sys/block/sde/device/delete" returned: 0 in 0.046s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[1f26a147-4047-4ee0-a72c-af1ba4283739]: (4, ('1', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-f0d0dbe6-77e0-4072-b67b-27bb8a0bcf99 tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Checking to see if SCSI volumes sde have been removed. {{(pid=70374) wait_for_volumes_removal /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:91}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-f0d0dbe6-77e0-4072-b67b-27bb8a0bcf99 tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] SCSI volumes sde have been removed. {{(pid=70374) wait_for_volumes_removal /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:101}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-f0d0dbe6-77e0-4072-b67b-27bb8a0bcf99 tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Disconnecting from: [('172.16.0.220:3260', 'iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e')] {{(pid=70374) _disconnect_connection /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1160}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e -p 172.16.0.220:3260 --op update -n node.startup -v manual {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e -p 172.16.0.220:3260 --op update -n node.startup -v manual" returned: 0 in 0.018s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[3accd9a1-8040-4f9f-b590-52cbba8f1d8d]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-f0d0dbe6-77e0-4072-b67b-27bb8a0bcf99 tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] iscsiadm ('--op', 'update', '-n', 'node.startup', '-v', 'manual'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T 
iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e -p 172.16.0.220:3260 --logout {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e -p 172.16.0.220:3260 --logout" returned: 0 in 0.064s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[fdff1cc4-91dc-43e4-b8f1-2c586e0e4195]: (4, ('Logging out of session [sid: 7, target: iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e, portal: 172.16.0.220,3260]\nLogout of [sid: 7, target: iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e, portal: 172.16.0.220,3260] successful.\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-f0d0dbe6-77e0-4072-b67b-27bb8a0bcf99 tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] iscsiadm ('--logout',): stdout=Logging out of session [sid: 7, target: iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e, portal: 172.16.0.220,3260] Jul 27 09:32:51 user nova-compute[70374]: Logout of [sid: 7, target: iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e, portal: 172.16.0.220,3260] successful. Jul 27 09:32:51 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e -p 172.16.0.220:3260 --op delete {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:0ec6fb44-65f2-48c6-9c3b-900d51d5dc0e -p 172.16.0.220:3260 --op delete" returned: 0 in 0.024s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[de295849-2b00-4325-b890-8d9883df4f00]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-f0d0dbe6-77e0-4072-b67b-27bb8a0bcf99 tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] iscsiadm ('--op', 'delete'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-f0d0dbe6-77e0-4072-b67b-27bb8a0bcf99 tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Lock "connect_volume" "released" by "os_brick.initiator.connectors.iscsi.ISCSIConnector.disconnect_volume" :: held 0.309s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:87}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG 
os_brick.initiator.connectors.iscsi [None req-f0d0dbe6-77e0-4072-b67b-27bb8a0bcf99 tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] <== disconnect_volume: return (311ms) None {{(pid=70374) trace_logging_wrapper /usr/local/lib/python3.10/dist-packages/os_brick/utils.py:202}} Jul 27 09:32:51 user nova-compute[70374]: DEBUG nova.virt.libvirt.volume.iscsi [None req-f0d0dbe6-77e0-4072-b67b-27bb8a0bcf99 tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Disconnected iSCSI Volume {{(pid=70374) disconnect_volume /opt/stack/nova/nova/virt/libvirt/volume/iscsi.py:79}} Jul 27 09:32:54 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:54 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG nova.virt.libvirt.volume.iscsi [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Calling os-brick to attach iSCSI Volume {{(pid=70374) connect_volume /opt/stack/nova/nova/virt/libvirt/volume/iscsi.py:63}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] ==> connect_volume: call "{'self': , 'connection_properties': {'target_iqn': 'iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9', 'target_portal': '172.16.0.220:3260', 'target_discovered': False, 'auth_method': 'CHAP', 'auth_username': 'rAiDNkvz', 'auth_password': '***', 'target_lun': 0, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False}}" {{(pid=70374) trace_logging_wrapper /usr/local/lib/python3.10/dist-packages/os_brick/utils.py:175}} Jul 27 09:32:56 user nova-compute[70374]: WARNING os_brick.initiator.connectors.base [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Service needs to call os_brick.setup() before connecting volumes, if it doesn't it will break on the next release Jul 27 09:32:56 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Acquiring lock "connect_volume" by "os_brick.initiator.connectors.iscsi.ISCSIConnector.connect_volume" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:68}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Lock "connect_volume" acquired by "os_brick.initiator.connectors.iscsi.ISCSIConnector.connect_volume" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:73}} Jul 27 09:32:56 user nova-compute[70374]: WARNING os_brick.initiator.connectors.base [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d 
tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Service needs to call os_brick.setup() before connecting volumes, if it doesn't it will break on the next release Jul 27 09:32:56 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Acquiring lock "connect_to_iscsi_portal-172.16.0.220:3260-iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9" by "os_brick.initiator.connectors.iscsi.ISCSIConnector._connect_to_iscsi_portal_unsafe" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:68}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Lock "connect_to_iscsi_portal-172.16.0.220:3260-iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9" acquired by "os_brick.initiator.connectors.iscsi.ISCSIConnector._connect_to_iscsi_portal_unsafe" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:73}} Jul 27 09:32:56 user nova-compute[70374]: INFO os_brick.initiator.connectors.iscsi [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Trying to connect to iSCSI portal 172.16.0.220:3260 Jul 27 09:32:56 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9 -p 172.16.0.220:3260 {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9 -p 172.16.0.220:3260" returned: 21 in 0.018s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[18b226cc-afdc-4892-b13e-50bf7c297310]: (4, ('', 'iscsiadm: No records found\n')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] iscsiadm (): stdout= stderr=iscsiadm: No records found Jul 27 09:32:56 user nova-compute[70374]: {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9 -p 172.16.0.220:3260 --interface default --op new {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9 -p 172.16.0.220:3260 --interface default --op new" returned: 0 in 0.018s {{(pid=82164) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[9877d1fa-34c7-4671-97eb-35cbc95f66bf]: (4, ('New iSCSI node [tcp:[hw=,ip=,net_if=,iscsi_if=default] 172.16.0.220,3260,-1 iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9] added\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] iscsiadm ('--interface', 'default', '--op', 'new'): stdout=New iSCSI node [tcp:[hw=,ip=,net_if=,iscsi_if=default] 172.16.0.220,3260,-1 iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9] added Jul 27 09:32:56 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9 -p 172.16.0.220:3260 --op update -n node.session.scan -v manual {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9 -p 172.16.0.220:3260 --op update -n node.session.scan -v manual" returned: 0 in 0.019s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[955fc122-67e9-422d-a80f-934f6746622e]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] iscsiadm ('--op', 'update', '-n', 'node.session.scan', '-v', 'manual'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9 -p 172.16.0.220:3260 --op update -n node.session.auth.authmethod -v CHAP {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9 -p 172.16.0.220:3260 --op update -n node.session.auth.authmethod -v CHAP" returned: 0 in 0.018s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[32cf7af0-7d2c-49a2-837f-23cf03ecc2b2]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d 
tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] iscsiadm ('--op', 'update', '-n', 'node.session.auth.authmethod', '-v', 'CHAP'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9 -p 172.16.0.220:3260 --op update -n node.session.auth.username -v rAiDNkvz {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9 -p 172.16.0.220:3260 --op update -n node.session.auth.username -v rAiDNkvz" returned: 0 in 0.023s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[ac6fc131-066b-4fde-a085-b55d89fce261]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] iscsiadm ('--op', 'update', '-n', 'node.session.auth.username', '-v', 'rAiDNkvz'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9 -p 172.16.0.220:3260 --op update -n node.session.auth.password -v *** {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9 -p 172.16.0.220:3260 --op update -n node.session.auth.password -v ***" returned: 0 in 0.020s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[080db09f-e4cf-4016-a761-a5c3367275ce]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] iscsiadm ('--op', 'update', '-n', 'node.session.auth.password', '-v', '***'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m session {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m session" returned: 0 in 0.022s {{(pid=82164) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[d50920b6-911a-4ef4-9371-a80e0ed0350a]: (4, ('tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash)\ntcp: [4] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash)\ntcp: [6] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac (non-flash)\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] iscsiadm ('-m', 'session'): stdout=tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash) Jul 27 09:32:56 user nova-compute[70374]: tcp: [4] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash) Jul 27 09:32:56 user nova-compute[70374]: tcp: [6] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac (non-flash) Jul 27 09:32:56 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm_bare /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1182}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] iscsi session list stdout=tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash) Jul 27 09:32:56 user nova-compute[70374]: tcp: [4] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash) Jul 27 09:32:56 user nova-compute[70374]: tcp: [6] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac (non-flash) Jul 27 09:32:56 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsi_session /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1171}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9 -p 172.16.0.220:3260 --login {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9 -p 172.16.0.220:3260 --login" returned: 0 in 0.061s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[7b51b80e-6e3c-4db2-a8a1-f0a599ce03bf]: (4, ('Logging in to [iface: default, target: iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9, portal: 172.16.0.220,3260]\nLogin to [iface: default, target: iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9, portal: 172.16.0.220,3260] successful.\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d 
tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] iscsiadm ('--login',): stdout=Logging in to [iface: default, target: iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9, portal: 172.16.0.220,3260] Jul 27 09:32:56 user nova-compute[70374]: Login to [iface: default, target: iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9, portal: 172.16.0.220,3260] successful. Jul 27 09:32:56 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9 -p 172.16.0.220:3260 --op update -n node.startup -v automatic {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9 -p 172.16.0.220:3260 --op update -n node.startup -v automatic" returned: 0 in 0.018s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[d8408be6-60ac-4312-a789-a88ce2cc09a7]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] iscsiadm ('--op', 'update', '-n', 'node.startup', '-v', 'automatic'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m session {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m session" returned: 0 in 0.022s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[9ad58e7a-4d61-45e6-bf57-d9b87e88d452]: (4, ('tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash)\ntcp: [4] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash)\ntcp: [6] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac (non-flash)\ntcp: [8] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9 (non-flash)\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] iscsiadm ('-m', 'session'): stdout=tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash) Jul 27 09:32:56 user nova-compute[70374]: tcp: [4] 172.16.0.220:3260,1 
iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash) Jul 27 09:32:56 user nova-compute[70374]: tcp: [6] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac (non-flash) Jul 27 09:32:56 user nova-compute[70374]: tcp: [8] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9 (non-flash) Jul 27 09:32:56 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm_bare /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1182}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] iscsi session list stdout=tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash) Jul 27 09:32:56 user nova-compute[70374]: tcp: [4] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash) Jul 27 09:32:56 user nova-compute[70374]: tcp: [6] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac (non-flash) Jul 27 09:32:56 user nova-compute[70374]: tcp: [8] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9 (non-flash) Jul 27 09:32:56 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsi_session /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1171}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Lock "connect_to_iscsi_portal-172.16.0.220:3260-iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9" "released" by "os_brick.initiator.connectors.iscsi.ISCSIConnector._connect_to_iscsi_portal_unsafe" :: held 0.278s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:87}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Connected to 172.16.0.220:3260 {{(pid=70374) _connect_vol /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:633}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] HCTL ('36', '-', '-', 0) found on session 8 with lun 0 {{(pid=70374) get_hctl /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:694}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Scanning host 36 c: -, t: -, l: 0) {{(pid=70374) scan_iscsi /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:719}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): tee -a /sys/class/scsi_host/host36/scan {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "tee -a /sys/class/scsi_host/host36/scan" returned: 0 in 
0.016s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[755f2c25-2e74-420f-90f4-fe5ec59fff2e]: (4, ('- - 0', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Searching for a device in session 8 and hctl ['36', '*', '*', 0] yield: sde {{(pid=70374) device_name_by_hctl /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:713}} Jul 27 09:32:56 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Connected to sde using {'target_iqn': 'iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9', 'target_portal': '172.16.0.220:3260', 'target_discovered': False, 'auth_method': 'CHAP', 'auth_username': 'rAiDNkvz', 'auth_password': '***', 'target_lun': 0, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False} {{(pid=70374) _connect_vol /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:661}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Lock "connect_volume" "released" by "os_brick.initiator.connectors.iscsi.ISCSIConnector.connect_volume" :: held 1.313s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:87}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] <== connect_volume: return (1315ms) {'type': 'block', 'scsi_wwn': '23831346331333234', 'path': '/dev/sde'} {{(pid=70374) trace_logging_wrapper /usr/local/lib/python3.10/dist-packages/os_brick/utils.py:202}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG nova.virt.libvirt.volume.iscsi [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Attached iSCSI volume {'type': 'block', 'scsi_wwn': '23831346331333234', 'path': '/dev/sde'} {{(pid=70374) connect_volume /opt/stack/nova/nova/virt/libvirt/volume/iscsi.py:65}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG nova.objects.instance [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Lazy-loading 'flavor' on Instance uuid b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG nova.virt.libvirt.guest [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] attach device xml: Jul 27 09:32:57 user nova-compute[70374]: Jul 27 09:32:57 user nova-compute[70374]: Jul 27 09:32:57 user nova-compute[70374]: Jul 27 09:32:57 user nova-compute[70374]: 1bee565c-934c-49bb-9a23-cafd79caf6c9 Jul 
27 09:32:57 user nova-compute[70374]: Jul 27 09:32:57 user nova-compute[70374]: {{(pid=70374) attach_device /opt/stack/nova/nova/virt/libvirt/guest.py:339}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG nova.objects.instance [None req-f0d0dbe6-77e0-4072-b67b-27bb8a0bcf99 tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Lazy-loading 'flavor' on Instance uuid 309c9c26-4a0f-45db-bb3a-595b19f3f627 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-f0d0dbe6-77e0-4072-b67b-27bb8a0bcf99 tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Lock "309c9c26-4a0f-45db-bb3a-595b19f3f627" "released" by "nova.compute.manager.ComputeManager.detach_volume..do_detach_volume" :: held 6.713s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] No BDM found with device name vda, not building metadata. {{(pid=70374) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] No BDM found with device name hda, not building metadata. {{(pid=70374) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12092}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] No VIF found with MAC fa:16:3e:b2:9d:c3, not building metadata {{(pid=70374) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12068}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG nova.compute.manager [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] swap_volume: Calling driver volume swap with connection infos: new: {'driver_volume_type': 'iscsi', 'data': {'target_iqn': 'iqn.2010-10.org.openstack:4f6a5d8f-095f-487e-b752-c9b1ce301b09', 'target_portal': '172.16.0.220:3260', 'target_discovered': False, 'auth_method': 'CHAP', 'auth_username': 'MhDngcBP', 'auth_password': '***', 'target_lun': 0, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'd35fe056-8279-479a-a673-6c61e5ec6933', 'attached_at': '', 'detached_at': '', 'volume_id': '4f6a5d8f-095f-487e-b752-c9b1ce301b09', 'serial': '4f6a5d8f-095f-487e-b752-c9b1ce301b09'}; old: {'driver_volume_type': 'iscsi', 'data': {'target_iqn': 'iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac', 'target_portal': '172.16.0.220:3260', 'target_discovered': False, 'auth_method': 'CHAP', 'auth_username': 'KUpudWeL', 'auth_password': '***', 'target_lun': 0, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False, 'device_path': '/dev/sdd'}, 'status': 'reserved', 'instance': 'd35fe056-8279-479a-a673-6c61e5ec6933', 'attached_at': '', 'detached_at': '', 'volume_id': '25fa94b4-c6dc-4dbd-add9-e3a58523baac', 'serial'} 
{{(pid=70374) _swap_volume /opt/stack/nova/nova/compute/manager.py:7702}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG nova.virt.libvirt.volume.iscsi [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Calling os-brick to attach iSCSI Volume {{(pid=70374) connect_volume /opt/stack/nova/nova/virt/libvirt/volume/iscsi.py:63}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] ==> connect_volume: call "{'self': , 'connection_properties': {'target_iqn': 'iqn.2010-10.org.openstack:4f6a5d8f-095f-487e-b752-c9b1ce301b09', 'target_portal': '172.16.0.220:3260', 'target_discovered': False, 'auth_method': 'CHAP', 'auth_username': 'MhDngcBP', 'auth_password': '***', 'target_lun': 0, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False}}" {{(pid=70374) trace_logging_wrapper /usr/local/lib/python3.10/dist-packages/os_brick/utils.py:175}} Jul 27 09:32:57 user nova-compute[70374]: WARNING os_brick.initiator.connectors.base [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] Service needs to call os_brick.setup() before connecting volumes, if it doesn't it will break on the next release Jul 27 09:32:57 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] Acquiring lock "connect_volume" by "os_brick.initiator.connectors.iscsi.ISCSIConnector.connect_volume" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:68}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] Lock "connect_volume" acquired by "os_brick.initiator.connectors.iscsi.ISCSIConnector.connect_volume" :: waited 0.002s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:73}} Jul 27 09:32:57 user nova-compute[70374]: WARNING os_brick.initiator.connectors.base [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] Service needs to call os_brick.setup() before connecting volumes, if it doesn't it will break on the next release Jul 27 09:32:57 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] Acquiring lock "connect_to_iscsi_portal-172.16.0.220:3260-iqn.2010-10.org.openstack:4f6a5d8f-095f-487e-b752-c9b1ce301b09" by "os_brick.initiator.connectors.iscsi.ISCSIConnector._connect_to_iscsi_portal_unsafe" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:68}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] Lock "connect_to_iscsi_portal-172.16.0.220:3260-iqn.2010-10.org.openstack:4f6a5d8f-095f-487e-b752-c9b1ce301b09" acquired by 
"os_brick.initiator.connectors.iscsi.ISCSIConnector._connect_to_iscsi_portal_unsafe" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:73}} Jul 27 09:32:57 user nova-compute[70374]: INFO os_brick.initiator.connectors.iscsi [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] Trying to connect to iSCSI portal 172.16.0.220:3260 Jul 27 09:32:57 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:4f6a5d8f-095f-487e-b752-c9b1ce301b09 -p 172.16.0.220:3260 {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:4f6a5d8f-095f-487e-b752-c9b1ce301b09 -p 172.16.0.220:3260" returned: 21 in 0.019s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[6b71c310-a827-4c9b-ae77-92337fbadbfd]: (4, ('', 'iscsiadm: No records found\n')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] iscsiadm (): stdout= stderr=iscsiadm: No records found Jul 27 09:32:57 user nova-compute[70374]: {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:4f6a5d8f-095f-487e-b752-c9b1ce301b09 -p 172.16.0.220:3260 --interface default --op new {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:4f6a5d8f-095f-487e-b752-c9b1ce301b09 -p 172.16.0.220:3260 --interface default --op new" returned: 0 in 0.018s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[7674f8f2-0c44-4a96-97e0-94ca1e70b230]: (4, ('New iSCSI node [tcp:[hw=,ip=,net_if=,iscsi_if=default] 172.16.0.220,3260,-1 iqn.2010-10.org.openstack:4f6a5d8f-095f-487e-b752-c9b1ce301b09] added\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] iscsiadm ('--interface', 'default', '--op', 'new'): stdout=New iSCSI node [tcp:[hw=,ip=,net_if=,iscsi_if=default] 172.16.0.220,3260,-1 iqn.2010-10.org.openstack:4f6a5d8f-095f-487e-b752-c9b1ce301b09] added Jul 27 09:32:57 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm 
-m node -T iqn.2010-10.org.openstack:4f6a5d8f-095f-487e-b752-c9b1ce301b09 -p 172.16.0.220:3260 --op update -n node.session.scan -v manual {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:4f6a5d8f-095f-487e-b752-c9b1ce301b09 -p 172.16.0.220:3260 --op update -n node.session.scan -v manual" returned: 0 in 0.021s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[426c9b63-642f-439c-bc73-13636a26d129]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] iscsiadm ('--op', 'update', '-n', 'node.session.scan', '-v', 'manual'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:4f6a5d8f-095f-487e-b752-c9b1ce301b09 -p 172.16.0.220:3260 --op update -n node.session.auth.authmethod -v CHAP {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:4f6a5d8f-095f-487e-b752-c9b1ce301b09 -p 172.16.0.220:3260 --op update -n node.session.auth.authmethod -v CHAP" returned: 0 in 0.023s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[75ea38ab-747f-440e-a444-c0c0965a780d]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] iscsiadm ('--op', 'update', '-n', 'node.session.auth.authmethod', '-v', 'CHAP'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:4f6a5d8f-095f-487e-b752-c9b1ce301b09 -p 172.16.0.220:3260 --op update -n node.session.auth.username -v MhDngcBP {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-9377f114-9a4e-470e-9e9f-9a7b689b241d tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Lock "b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06" "released" by "nova.compute.manager.ComputeManager.attach_volume..do_attach_volume" :: held 9.190s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD 
"iscsiadm -m node -T iqn.2010-10.org.openstack:4f6a5d8f-095f-487e-b752-c9b1ce301b09 -p 172.16.0.220:3260 --op update -n node.session.auth.username -v MhDngcBP" returned: 0 in 0.021s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[bdadc357-b23c-44cd-a731-b9525d2b591f]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] iscsiadm ('--op', 'update', '-n', 'node.session.auth.username', '-v', 'MhDngcBP'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:4f6a5d8f-095f-487e-b752-c9b1ce301b09 -p 172.16.0.220:3260 --op update -n node.session.auth.password -v *** {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:4f6a5d8f-095f-487e-b752-c9b1ce301b09 -p 172.16.0.220:3260 --op update -n node.session.auth.password -v ***" returned: 0 in 0.019s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[b4706874-d952-4e7a-bda5-e9717dc17754]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] iscsiadm ('--op', 'update', '-n', 'node.session.auth.password', '-v', '***'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m session {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m session" returned: 0 in 0.022s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[32bbca06-14e1-4ed9-a093-1984f3797a54]: (4, ('tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash)\ntcp: [4] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash)\ntcp: [6] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac (non-flash)\ntcp: [8] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9 (non-flash)\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None 
req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] iscsiadm ('-m', 'session'): stdout=tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash) Jul 27 09:32:57 user nova-compute[70374]: tcp: [4] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash) Jul 27 09:32:57 user nova-compute[70374]: tcp: [6] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac (non-flash) Jul 27 09:32:57 user nova-compute[70374]: tcp: [8] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9 (non-flash) Jul 27 09:32:57 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm_bare /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1182}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] iscsi session list stdout=tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash) Jul 27 09:32:57 user nova-compute[70374]: tcp: [4] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash) Jul 27 09:32:57 user nova-compute[70374]: tcp: [6] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac (non-flash) Jul 27 09:32:57 user nova-compute[70374]: tcp: [8] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9 (non-flash) Jul 27 09:32:57 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsi_session /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1171}} Jul 27 09:32:57 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:4f6a5d8f-095f-487e-b752-c9b1ce301b09 -p 172.16.0.220:3260 --login {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:58 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:4f6a5d8f-095f-487e-b752-c9b1ce301b09 -p 172.16.0.220:3260 --login" returned: 0 in 0.065s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:58 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[f8377182-32f9-485a-a4cc-3d28d55fd23e]: (4, ('Logging in to [iface: default, target: iqn.2010-10.org.openstack:4f6a5d8f-095f-487e-b752-c9b1ce301b09, portal: 172.16.0.220,3260]\nLogin to [iface: default, target: iqn.2010-10.org.openstack:4f6a5d8f-095f-487e-b752-c9b1ce301b09, portal: 172.16.0.220,3260] successful.\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:58 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] iscsiadm ('--login',): stdout=Logging in to [iface: default, target: iqn.2010-10.org.openstack:4f6a5d8f-095f-487e-b752-c9b1ce301b09, portal: 172.16.0.220,3260] Jul 27 09:32:58 user nova-compute[70374]: Login to [iface: default, target: iqn.2010-10.org.openstack:4f6a5d8f-095f-487e-b752-c9b1ce301b09, portal: 172.16.0.220,3260] successful. 
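
The two attach flows in this stretch of the log (targets iqn.2010-10.org.openstack:1bee565c-... and ...:4f6a5d8f-...) show the complete iscsiadm sequence os-brick drives through its privsep daemon: query the node record, create it with --op new when iscsiadm exits 21 ("No records found"), set node.session.scan to manual, program the CHAP credentials, log in, and finally (in the entries just below) mark node.startup automatic so the session survives a reboot. A minimal stand-alone sketch of that sequence in plain Python follows; the iscsiadm and connect_portal helpers are illustrative names, not os-brick's actual API, and error handling is reduced to a single exception.

    import subprocess

    def iscsiadm(*args, ok_codes=(0,)):
        # Thin wrapper mirroring the "Running cmd (subprocess): iscsiadm ..." entries above.
        cmd = ["iscsiadm", *args]
        proc = subprocess.run(cmd, capture_output=True, text=True)
        if proc.returncode not in ok_codes:
            raise RuntimeError(f"{' '.join(cmd)} failed ({proc.returncode}): {proc.stderr}")
        return proc.returncode, proc.stdout

    def connect_portal(iqn, portal, chap_user, chap_password):
        node = ("-m", "node", "-T", iqn, "-p", portal)
        # Exit code 21 with "No records found" (seen above for the new target) means the
        # node record does not exist yet, so create it first.
        rc, _ = iscsiadm(*node, ok_codes=(0, 21))
        if rc == 21:
            iscsiadm(*node, "--interface", "default", "--op", "new")
        # Disable automatic LUN scanning; the host is rescanned explicitly after login.
        iscsiadm(*node, "--op", "update", "-n", "node.session.scan", "-v", "manual")
        iscsiadm(*node, "--op", "update", "-n", "node.session.auth.authmethod", "-v", "CHAP")
        iscsiadm(*node, "--op", "update", "-n", "node.session.auth.username", "-v", chap_user)
        iscsiadm(*node, "--op", "update", "-n", "node.session.auth.password", "-v", chap_password)
        iscsiadm(*node, "--login")
        # Re-establish the session automatically after a reboot.
        iscsiadm(*node, "--op", "update", "-n", "node.startup", "-v", "automatic")

Run against the portal and IQN from this log, the sketch would emit the same command lines that appear in the oslo_concurrency.processutils entries, minus the privsep round-trips.
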
Jul 27 09:32:58 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:58 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:4f6a5d8f-095f-487e-b752-c9b1ce301b09 -p 172.16.0.220:3260 --op update -n node.startup -v automatic {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:58 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:4f6a5d8f-095f-487e-b752-c9b1ce301b09 -p 172.16.0.220:3260 --op update -n node.startup -v automatic" returned: 0 in 0.018s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:58 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[a325896f-2c60-4785-82fd-fa082aa739ac]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:58 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] iscsiadm ('--op', 'update', '-n', 'node.startup', '-v', 'automatic'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:32:58 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m session {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:58 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m session" returned: 0 in 0.025s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:58 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[fb121a27-abe1-4a30-b3a4-bdbe5b50b71e]: (4, ('tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash)\ntcp: [4] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash)\ntcp: [6] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac (non-flash)\ntcp: [8] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9 (non-flash)\ntcp: [9] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:4f6a5d8f-095f-487e-b752-c9b1ce301b09 (non-flash)\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:58 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] iscsiadm ('-m', 'session'): stdout=tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash) Jul 27 09:32:58 user nova-compute[70374]: tcp: [4] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash) Jul 27 09:32:58 user nova-compute[70374]: tcp: [6] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac (non-flash) Jul 27 09:32:58 user nova-compute[70374]: tcp: [8] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9 (non-flash) Jul 27 09:32:58 user 
nova-compute[70374]: tcp: [9] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:4f6a5d8f-095f-487e-b752-c9b1ce301b09 (non-flash) Jul 27 09:32:58 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm_bare /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1182}} Jul 27 09:32:58 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] iscsi session list stdout=tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash) Jul 27 09:32:58 user nova-compute[70374]: tcp: [4] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash) Jul 27 09:32:58 user nova-compute[70374]: tcp: [6] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac (non-flash) Jul 27 09:32:58 user nova-compute[70374]: tcp: [8] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9 (non-flash) Jul 27 09:32:58 user nova-compute[70374]: tcp: [9] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:4f6a5d8f-095f-487e-b752-c9b1ce301b09 (non-flash) Jul 27 09:32:58 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsi_session /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1171}} Jul 27 09:32:58 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] Lock "connect_to_iscsi_portal-172.16.0.220:3260-iqn.2010-10.org.openstack:4f6a5d8f-095f-487e-b752-c9b1ce301b09" "released" by "os_brick.initiator.connectors.iscsi.ISCSIConnector._connect_to_iscsi_portal_unsafe" :: held 0.292s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:87}} Jul 27 09:32:58 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] Connected to 172.16.0.220:3260 {{(pid=70374) _connect_vol /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:633}} Jul 27 09:32:58 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] HCTL ('37', '-', '-', 0) found on session 9 with lun 0 {{(pid=70374) get_hctl /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:694}} Jul 27 09:32:58 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] Scanning host 37 c: -, t: -, l: 0) {{(pid=70374) scan_iscsi /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:719}} Jul 27 09:32:58 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): tee -a /sys/class/scsi_host/host37/scan {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:32:58 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "tee -a /sys/class/scsi_host/host37/scan" returned: 0 in 0.015s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:32:58 user nova-compute[70374]: DEBUG 
oslo.privsep.daemon [-] privsep: reply[ae0d0464-8dae-44f9-9e8e-818c62f2a42d]: (4, ('- - 0', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:32:58 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] Searching for a device in session 9 and hctl ['37', '*', '*', 0] yield: sdf {{(pid=70374) device_name_by_hctl /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:713}} Jul 27 09:32:58 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] Connected to sdf using {'target_iqn': 'iqn.2010-10.org.openstack:4f6a5d8f-095f-487e-b752-c9b1ce301b09', 'target_portal': '172.16.0.220:3260', 'target_discovered': False, 'auth_method': 'CHAP', 'auth_username': 'MhDngcBP', 'auth_password': '***', 'target_lun': 0, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False} {{(pid=70374) _connect_vol /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:661}} Jul 27 09:32:58 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-72d6585d-6a93-4031-938e-d00cbeffa4da tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Acquiring lock "309c9c26-4a0f-45db-bb3a-595b19f3f627" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:58 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-72d6585d-6a93-4031-938e-d00cbeffa4da tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Lock "309c9c26-4a0f-45db-bb3a-595b19f3f627" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:58 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-72d6585d-6a93-4031-938e-d00cbeffa4da tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Acquiring lock "309c9c26-4a0f-45db-bb3a-595b19f3f627-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:58 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-72d6585d-6a93-4031-938e-d00cbeffa4da tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Lock "309c9c26-4a0f-45db-bb3a-595b19f3f627-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:58 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-72d6585d-6a93-4031-938e-d00cbeffa4da tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Lock "309c9c26-4a0f-45db-bb3a-595b19f3f627-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=70374) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:58 user nova-compute[70374]: INFO nova.compute.manager [None req-72d6585d-6a93-4031-938e-d00cbeffa4da tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Terminating instance Jul 27 09:32:58 user nova-compute[70374]: DEBUG nova.compute.manager [None req-72d6585d-6a93-4031-938e-d00cbeffa4da tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Start destroying the instance on the hypervisor. {{(pid=70374) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} Jul 27 09:32:58 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:58 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:58 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:58 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:58 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-2a3dcb71-5936-424f-ac97-2fef09603f73 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Acquiring lock "b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06" by "nova.compute.manager.ComputeManager.detach_volume..do_detach_volume" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:58 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-2a3dcb71-5936-424f-ac97-2fef09603f73 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Lock "b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06" acquired by "nova.compute.manager.ComputeManager.detach_volume..do_detach_volume" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:58 user nova-compute[70374]: DEBUG nova.compute.manager [req-65b4a896-1f60-4151-a343-c6d76ed56db0 req-efc0e918-a52d-4d6e-abf4-874007a81809 service nova] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Received event network-vif-unplugged-626b287e-22ee-49d7-8ec3-c1a0660bd5d8 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:32:58 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-65b4a896-1f60-4151-a343-c6d76ed56db0 req-efc0e918-a52d-4d6e-abf4-874007a81809 service nova] Acquiring lock "309c9c26-4a0f-45db-bb3a-595b19f3f627-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:32:58 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-65b4a896-1f60-4151-a343-c6d76ed56db0 req-efc0e918-a52d-4d6e-abf4-874007a81809 service nova] Lock "309c9c26-4a0f-45db-bb3a-595b19f3f627-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=70374) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:32:58 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-65b4a896-1f60-4151-a343-c6d76ed56db0 req-efc0e918-a52d-4d6e-abf4-874007a81809 service nova] Lock "309c9c26-4a0f-45db-bb3a-595b19f3f627-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:32:58 user nova-compute[70374]: DEBUG nova.compute.manager [req-65b4a896-1f60-4151-a343-c6d76ed56db0 req-efc0e918-a52d-4d6e-abf4-874007a81809 service nova] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] No waiting events found dispatching network-vif-unplugged-626b287e-22ee-49d7-8ec3-c1a0660bd5d8 {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:32:58 user nova-compute[70374]: DEBUG nova.compute.manager [req-65b4a896-1f60-4151-a343-c6d76ed56db0 req-efc0e918-a52d-4d6e-abf4-874007a81809 service nova] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Received event network-vif-unplugged-626b287e-22ee-49d7-8ec3-c1a0660bd5d8 for instance with task_state deleting. {{(pid=70374) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10810}} Jul 27 09:32:58 user nova-compute[70374]: INFO nova.compute.manager [None req-2a3dcb71-5936-424f-ac97-2fef09603f73 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Detaching volume 1bee565c-934c-49bb-9a23-cafd79caf6c9 Jul 27 09:32:59 user nova-compute[70374]: INFO nova.virt.block_device [None req-2a3dcb71-5936-424f-ac97-2fef09603f73 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Attempting to driver detach volume 1bee565c-934c-49bb-9a23-cafd79caf6c9 from mountpoint /dev/vdb Jul 27 09:32:59 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-2a3dcb71-5936-424f-ac97-2fef09603f73 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Attempting to detach device vdb from instance b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06 from the persistent domain config. {{(pid=70374) _detach_from_persistent /opt/stack/nova/nova/virt/libvirt/driver.py:2477}} Jul 27 09:32:59 user nova-compute[70374]: DEBUG nova.virt.libvirt.guest [None req-2a3dcb71-5936-424f-ac97-2fef09603f73 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] detach device xml: Jul 27 09:32:59 user nova-compute[70374]: Jul 27 09:32:59 user nova-compute[70374]: Jul 27 09:32:59 user nova-compute[70374]: Jul 27 09:32:59 user nova-compute[70374]: 1bee565c-934c-49bb-9a23-cafd79caf6c9 Jul 27 09:32:59 user nova-compute[70374]:
Jul 27 09:32:59 user nova-compute[70374]: Jul 27 09:32:59 user nova-compute[70374]: {{(pid=70374) detach_device /opt/stack/nova/nova/virt/libvirt/guest.py:465}} Jul 27 09:32:59 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-2a3dcb71-5936-424f-ac97-2fef09603f73 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Successfully detached device vdb from instance b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06 from the persistent domain config. Jul 27 09:32:59 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-2a3dcb71-5936-424f-ac97-2fef09603f73 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06 from the live domain config. {{(pid=70374) _detach_from_live_with_retry /opt/stack/nova/nova/virt/libvirt/driver.py:2513}} Jul 27 09:32:59 user nova-compute[70374]: DEBUG nova.virt.libvirt.guest [None req-2a3dcb71-5936-424f-ac97-2fef09603f73 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] detach device xml: Jul 27 09:32:59 user nova-compute[70374]: Jul 27 09:32:59 user nova-compute[70374]: Jul 27 09:32:59 user nova-compute[70374]: Jul 27 09:32:59 user nova-compute[70374]: 1bee565c-934c-49bb-9a23-cafd79caf6c9 Jul 27 09:32:59 user nova-compute[70374]:
Jul 27 09:32:59 user nova-compute[70374]: Jul 27 09:32:59 user nova-compute[70374]: {{(pid=70374) detach_device /opt/stack/nova/nova/virt/libvirt/guest.py:465}} Jul 27 09:32:59 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:59 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] Lock "connect_volume" "released" by "os_brick.initiator.connectors.iscsi.ISCSIConnector.connect_volume" :: held 1.322s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:87}} Jul 27 09:32:59 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] <== connect_volume: return (1324ms) {'type': 'block', 'scsi_wwn': '23539623031313534', 'path': '/dev/sdf'} {{(pid=70374) trace_logging_wrapper /usr/local/lib/python3.10/dist-packages/os_brick/utils.py:202}} Jul 27 09:32:59 user nova-compute[70374]: DEBUG nova.virt.libvirt.volume.iscsi [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] [instance: d35fe056-8279-479a-a673-6c61e5ec6933] Attached iSCSI volume {'type': 'block', 'scsi_wwn': '23539623031313534', 'path': '/dev/sdf'} {{(pid=70374) connect_volume /opt/stack/nova/nova/virt/libvirt/volume/iscsi.py:65}} Jul 27 09:32:59 user nova-compute[70374]: DEBUG nova.objects.instance [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] Lazy-loading 'flavor' on Instance uuid d35fe056-8279-479a-a673-6c61e5ec6933 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:32:59 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:59 user nova-compute[70374]: INFO nova.virt.libvirt.driver [-] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Instance destroyed successfully. 
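
After a successful login os-brick still has to find the block device the new session exposes: it resolves the session's HCTL (hosts 36 and 37 above), writes "- - 0" into /sys/class/scsi_host/hostNN/scan to force a rescan of that host with channel and target wildcarded, then matches the device that appears for that host and LUN (sde, then sdf, in this run). A simplified sysfs-only sketch of the same idea, with hypothetical helper names and none of os-brick's retry or multipath handling:

    import glob
    import os

    def rescan_host(host_num: int, lun: int = 0) -> None:
        # Same effect as the logged "tee -a /sys/class/scsi_host/hostNN/scan" writing "- - 0":
        # "-" wildcards the channel and target, only the LUN is fixed.
        with open(f"/sys/class/scsi_host/host{host_num}/scan", "w") as f:
            f.write(f"- - {lun}")

    def find_block_device(host_num: int, lun: int = 0) -> str | None:
        # After the rescan the new disk shows up under
        # /sys/class/scsi_device/<H:C:T:L>/device/block/<name>.
        for path in glob.glob(f"/sys/class/scsi_device/{host_num}:*:*:{lun}/device/block/*"):
            return os.path.basename(path)   # e.g. "sde" or "sdf" in the entries above
        return None

In the log the lookup succeeds on the first pass; the real connector wraps this step in retries before giving up, which the sketch leaves out.
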
Jul 27 09:32:59 user nova-compute[70374]: DEBUG nova.objects.instance [None req-72d6585d-6a93-4031-938e-d00cbeffa4da tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Lazy-loading 'resources' on Instance uuid 309c9c26-4a0f-45db-bb3a-595b19f3f627 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:32:59 user nova-compute[70374]: DEBUG nova.virt.libvirt.vif [None req-72d6585d-6a93-4031-938e-d00cbeffa4da tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2023-07-27T09:30:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-860974169',display_name='tempest-AttachSCSIVolumeTestJSON-server-860974169',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(6),hidden=False,host='user',hostname='tempest-attachscsivolumetestjson-server-860974169',id=8,image_ref='726e1210-3ce3-486f-9417-95adaf9ac235',info_cache=InstanceInfoCache,instance_type_id=6,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBK2UxiY9xja9j2e+bYNFRInNFJS7DjB/7i9A7rll9a0HowWQT/92Qp4K1SO60+YclaZAVgDJD7MloBXQCsn+BhUVizXOve/ao95EzmtZUBSUTgLTVXF8O7s8qW7tjuxGXQ==',key_name='tempest-keypair-936823846',keypairs=,launch_index=0,launched_at=2023-07-27T09:30:59Z,launched_on='user',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='cdd638e5400740279443a374e3e570d4',ramdisk_id='',reservation_id='r-010r1p2b',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='726e1210-3ce3-486f-9417-95adaf9ac235',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_scsi_model='virtio-scsi',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachSCSIVolumeTestJSON-284147988',owner_user_name='tempest-AttachSCSIVolumeTestJSON-284147988-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-07-27T09:30:59Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='103aa251c26c4987814bc5973d86e601',uuid=309c9c26-4a0f-45db-bb3a-595b19f3f627,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "626b287e-22ee-49d7-8ec3-c1a0660bd5d8", "address": "fa:16:3e:57:6c:c6", "network": {"id": "f6f1459d-22db-4bf7-8b7b-d6dec4fa18d5", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-2084493807-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.49", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cdd638e5400740279443a374e3e570d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap626b287e-22", "ovs_interfaceid": "626b287e-22ee-49d7-8ec3-c1a0660bd5d8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Jul 27 09:32:59 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-72d6585d-6a93-4031-938e-d00cbeffa4da tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Converting VIF {"id": "626b287e-22ee-49d7-8ec3-c1a0660bd5d8", "address": "fa:16:3e:57:6c:c6", "network": {"id": "f6f1459d-22db-4bf7-8b7b-d6dec4fa18d5", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-2084493807-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.49", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cdd638e5400740279443a374e3e570d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap626b287e-22", "ovs_interfaceid": "626b287e-22ee-49d7-8ec3-c1a0660bd5d8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Jul 27 09:32:59 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-72d6585d-6a93-4031-938e-d00cbeffa4da tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:57:6c:c6,bridge_name='br-int',has_traffic_filtering=True,id=626b287e-22ee-49d7-8ec3-c1a0660bd5d8,network=Network(f6f1459d-22db-4bf7-8b7b-d6dec4fa18d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap626b287e-22') {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Jul 27 09:32:59 user nova-compute[70374]: DEBUG os_vif [None req-72d6585d-6a93-4031-938e-d00cbeffa4da tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:57:6c:c6,bridge_name='br-int',has_traffic_filtering=True,id=626b287e-22ee-49d7-8ec3-c1a0660bd5d8,network=Network(f6f1459d-22db-4bf7-8b7b-d6dec4fa18d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap626b287e-22') {{(pid=70374) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Jul 27 09:32:59 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:59 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): 
DelPortCommand(_result=None, port=tap626b287e-22, bridge=br-int, if_exists=True) {{(pid=70374) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Jul 27 09:32:59 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:59 user nova-compute[70374]: DEBUG nova.virt.libvirt.guest [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] COPY block job progress, current cursor: 9437184 final cursor: 1073741824 {{(pid=70374) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Jul 27 09:32:59 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Jul 27 09:32:59 user nova-compute[70374]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=70374) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Jul 27 09:32:59 user nova-compute[70374]: INFO os_vif [None req-72d6585d-6a93-4031-938e-d00cbeffa4da tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:57:6c:c6,bridge_name='br-int',has_traffic_filtering=True,id=626b287e-22ee-49d7-8ec3-c1a0660bd5d8,network=Network(f6f1459d-22db-4bf7-8b7b-d6dec4fa18d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap626b287e-22') Jul 27 09:32:59 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-72d6585d-6a93-4031-938e-d00cbeffa4da tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Deleting instance files /opt/stack/data/nova/instances/309c9c26-4a0f-45db-bb3a-595b19f3f627_del Jul 27 09:32:59 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-72d6585d-6a93-4031-938e-d00cbeffa4da tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Deletion of /opt/stack/data/nova/instances/309c9c26-4a0f-45db-bb3a-595b19f3f627_del complete Jul 27 09:32:59 user nova-compute[70374]: INFO nova.compute.manager [None req-72d6585d-6a93-4031-938e-d00cbeffa4da tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Took 1.21 seconds to destroy the instance on the hypervisor. Jul 27 09:32:59 user nova-compute[70374]: DEBUG oslo.service.loopingcall [None req-72d6585d-6a93-4031-938e-d00cbeffa4da tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
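
Instance teardown runs in parallel with the volume work here: Nova converts its VIF description to an os-vif VIFOpenVSwitch object, os-vif removes the tap port from br-int through an OVSDB transaction (the DelPortCommand just above), and then the instance directory under /opt/stack/data/nova/instances is deleted and network deallocation begins. Outside of os-vif, the same port removal can be expressed with the stock ovs-vsctl client; a small sketch using the port and bridge names from this log:

    import subprocess

    def unplug_ovs_port(bridge: str, port: str) -> None:
        # CLI equivalent of the logged DelPortCommand(port=..., bridge=..., if_exists=True),
        # issued with ovs-vsctl instead of ovsdbapp's Python IDL.
        subprocess.run(
            ["ovs-vsctl", "--if-exists", "del-port", bridge, port],
            check=True,
        )

    # Port and bridge names taken from the unplug entries above.
    unplug_ovs_port("br-int", "tap626b287e-22")

os-vif talks to ovsdb directly through ovsdbapp rather than shelling out, but the resulting bridge state is the same.
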
{{(pid=70374) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Jul 27 09:32:59 user nova-compute[70374]: DEBUG nova.compute.manager [-] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Deallocating network for instance {{(pid=70374) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} Jul 27 09:32:59 user nova-compute[70374]: DEBUG nova.network.neutron [-] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] deallocate_for_instance() {{(pid=70374) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1795}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-4d7e72c1-4e89-4f44-806a-b66f03d5d874 None None] Received event virtio-disk1> from libvirt while the driver is waiting for it; dispatched. {{(pid=70374) emit_event /opt/stack/nova/nova/virt/libvirt/driver.py:2360}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-2a3dcb71-5936-424f-ac97-2fef09603f73 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06 {{(pid=70374) _detach_from_live_and_wait_for_event /opt/stack/nova/nova/virt/libvirt/driver.py:2589}} Jul 27 09:33:00 user nova-compute[70374]: INFO nova.virt.libvirt.driver [None req-2a3dcb71-5936-424f-ac97-2fef09603f73 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Successfully detached device vdb from instance b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06 from the live domain config. Jul 27 09:33:00 user nova-compute[70374]: DEBUG nova.virt.libvirt.volume.iscsi [None req-2a3dcb71-5936-424f-ac97-2fef09603f73 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] calling os-brick to detach iSCSI Volume {{(pid=70374) disconnect_volume /opt/stack/nova/nova/virt/libvirt/volume/iscsi.py:72}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-2a3dcb71-5936-424f-ac97-2fef09603f73 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] ==> disconnect_volume: call "{'args': (, {'target_iqn': 'iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9', 'target_portal': '172.16.0.220:3260', 'target_discovered': False, 'auth_method': 'CHAP', 'auth_username': 'rAiDNkvz', 'auth_password': '***', 'target_lun': 0, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False, 'device_path': '/dev/sde'}, None), 'kwargs': False}}" {{(pid=70374) trace_logging_wrapper /usr/local/lib/python3.10/dist-packages/os_brick/utils.py:175}} Jul 27 09:33:00 user nova-compute[70374]: WARNING os_brick.initiator.connectors.base [None req-2a3dcb71-5936-424f-ac97-2fef09603f73 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Service needs to call os_brick.setup() before connecting volumes, if it doesn't it will break on the next release Jul 27 09:33:00 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-2a3dcb71-5936-424f-ac97-2fef09603f73 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Acquiring lock "connect_volume" by "os_brick.initiator.connectors.iscsi.ISCSIConnector.disconnect_volume" {{(pid=70374) inner 
/usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:68}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-2a3dcb71-5936-424f-ac97-2fef09603f73 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Lock "connect_volume" acquired by "os_brick.initiator.connectors.iscsi.ISCSIConnector.disconnect_volume" :: waited 0.002s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:73}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m discoverydb -o show -P 1 {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m discoverydb -o show -P 1" returned: 0 in 0.033s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[030aa6bb-a9a5-4d85-9012-2412e3e31162]: (4, ('SENDTARGETS:\nNo targets found.\niSNS:\nNo targets found.\nSTATIC:\nTarget: iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9\n\tPortal: 172.16.0.220:3260,-1\n\t\tIface Name: default\nTarget: iqn.2010-10.org.openstack:4f6a5d8f-095f-487e-b752-c9b1ce301b09\n\tPortal: 172.16.0.220:3260,-1\n\t\tIface Name: default\nTarget: iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291\n\tPortal: 172.16.0.220:3260,-1\n\t\tIface Name: default\nTarget: iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251\n\tPortal: 172.16.0.220:3260,-1\n\t\tIface Name: default\nTarget: iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac\n\tPortal: 172.16.0.220:3260,-1\n\t\tIface Name: default\nFIRMWARE:\nNo targets found.\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-2a3dcb71-5936-424f-ac97-2fef09603f73 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] iscsiadm ['-m', 'discoverydb', '-o', 'show', '-P', 1]: stdout=SENDTARGETS: Jul 27 09:33:00 user nova-compute[70374]: No targets found. Jul 27 09:33:00 user nova-compute[70374]: iSNS: Jul 27 09:33:00 user nova-compute[70374]: No targets found. 
Jul 27 09:33:00 user nova-compute[70374]: STATIC: Jul 27 09:33:00 user nova-compute[70374]: Target: iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9 Jul 27 09:33:00 user nova-compute[70374]: Portal: 172.16.0.220:3260,-1 Jul 27 09:33:00 user nova-compute[70374]: Iface Name: default Jul 27 09:33:00 user nova-compute[70374]: Target: iqn.2010-10.org.openstack:4f6a5d8f-095f-487e-b752-c9b1ce301b09 Jul 27 09:33:00 user nova-compute[70374]: Portal: 172.16.0.220:3260,-1 Jul 27 09:33:00 user nova-compute[70374]: Iface Name: default Jul 27 09:33:00 user nova-compute[70374]: Target: iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 Jul 27 09:33:00 user nova-compute[70374]: Portal: 172.16.0.220:3260,-1 Jul 27 09:33:00 user nova-compute[70374]: Iface Name: default Jul 27 09:33:00 user nova-compute[70374]: Target: iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 Jul 27 09:33:00 user nova-compute[70374]: Portal: 172.16.0.220:3260,-1 Jul 27 09:33:00 user nova-compute[70374]: Iface Name: default Jul 27 09:33:00 user nova-compute[70374]: Target: iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac Jul 27 09:33:00 user nova-compute[70374]: Portal: 172.16.0.220:3260,-1 Jul 27 09:33:00 user nova-compute[70374]: Iface Name: default Jul 27 09:33:00 user nova-compute[70374]: FIRMWARE: Jul 27 09:33:00 user nova-compute[70374]: No targets found. Jul 27 09:33:00 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm_bare /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1182}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-2a3dcb71-5936-424f-ac97-2fef09603f73 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Regex to get portals from discoverydb: ^SENDTARGETS: Jul 27 09:33:00 user nova-compute[70374]: .*?^DiscoveryAddress: 172.16.0.220,3260.*? 
Jul 27 09:33:00 user nova-compute[70374]: (.*?)^(?:DiscoveryAddress|iSNS):.* {{(pid=70374) _get_discoverydb_portals /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:371}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-2a3dcb71-5936-424f-ac97-2fef09603f73 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Getting connected devices for (ips,iqns,luns)=[('172.16.0.220:3260', 'iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9', 0)] {{(pid=70374) _get_connection_devices /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:819}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node" returned: 0 in 0.027s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[535378f2-1a09-4399-9b26-4369c8b37ee7]: (4, ('172.16.0.220:3260,4294967295 iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9\n172.16.0.220:3260,4294967295 iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac\n172.16.0.220:3260,4294967295 iqn.2010-10.org.openstack:4f6a5d8f-095f-487e-b752-c9b1ce301b09\n172.16.0.220:3260,4294967295 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291\n172.16.0.220:3260,4294967295 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m session {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m session" returned: 0 in 0.035s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[b8ea5d76-a2cd-46c2-b21f-7598e555e9c2]: (4, ('tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash)\ntcp: [4] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash)\ntcp: [6] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac (non-flash)\ntcp: [8] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9 (non-flash)\ntcp: [9] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:4f6a5d8f-095f-487e-b752-c9b1ce301b09 (non-flash)\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-2a3dcb71-5936-424f-ac97-2fef09603f73 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] iscsiadm ('-m', 'session'): stdout=tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash) Jul 27 09:33:00 user nova-compute[70374]: tcp: [4] 172.16.0.220:3260,1 
iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash) Jul 27 09:33:00 user nova-compute[70374]: tcp: [6] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac (non-flash) Jul 27 09:33:00 user nova-compute[70374]: tcp: [8] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9 (non-flash) Jul 27 09:33:00 user nova-compute[70374]: tcp: [9] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:4f6a5d8f-095f-487e-b752-c9b1ce301b09 (non-flash) Jul 27 09:33:00 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm_bare /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1182}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-2a3dcb71-5936-424f-ac97-2fef09603f73 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] iscsi session list stdout=tcp: [3] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:ad9cc86c-1deb-4294-bdbf-5cce1d5b6251 (non-flash) Jul 27 09:33:00 user nova-compute[70374]: tcp: [4] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:788f71f6-0aca-4da9-a915-59560f5cb291 (non-flash) Jul 27 09:33:00 user nova-compute[70374]: tcp: [6] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:25fa94b4-c6dc-4dbd-add9-e3a58523baac (non-flash) Jul 27 09:33:00 user nova-compute[70374]: tcp: [8] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9 (non-flash) Jul 27 09:33:00 user nova-compute[70374]: tcp: [9] 172.16.0.220:3260,1 iqn.2010-10.org.openstack:4f6a5d8f-095f-487e-b752-c9b1ce301b09 (non-flash) Jul 27 09:33:00 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsi_session /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1171}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-2a3dcb71-5936-424f-ac97-2fef09603f73 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Resulting device map defaultdict(. 
at 0x7f27fc2dcdc0>, {('172.16.0.220:3260', 'iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9'): ({'sde'}, set())}) {{(pid=70374) _get_connection_devices /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:852}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-2a3dcb71-5936-424f-ac97-2fef09603f73 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Removing single pathed devices sde {{(pid=70374) remove_connection /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:309}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.027s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[8440377c-3451-4134-9fcd-f11fbf68816e]: (4, ('path checker states:\nup 5\n\npaths: 0\nbusy: False\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd del path /dev/sde {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "multipathd del path /dev/sde" returned: 0 in 0.026s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[dbb501a9-2534-4610-90e1-7f055c1fb464]: (4, ('ok\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-2a3dcb71-5936-424f-ac97-2fef09603f73 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Flushing IO for device /dev/sde {{(pid=70374) flush_device_io /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:369}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): blockdev --flushbufs /dev/sde {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG nova.virt.libvirt.guest [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] COPY block job progress, current cursor: 183500800 final cursor: 1073741824 {{(pid=70374) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "blockdev --flushbufs /dev/sde" returned: 0 in 0.029s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[2498e112-f1ce-4144-995d-f2bbc9f0cf54]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:33:00 user 
nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-2a3dcb71-5936-424f-ac97-2fef09603f73 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Remove SCSI device /dev/sde with /sys/block/sde/device/delete {{(pid=70374) remove_scsi_device /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:83}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): tee -a /sys/block/sde/device/delete {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "tee -a /sys/block/sde/device/delete" returned: 0 in 0.042s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[3b5ccfa0-6165-402a-9c54-faed4c454104]: (4, ('1', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-2a3dcb71-5936-424f-ac97-2fef09603f73 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Checking to see if SCSI volumes sde have been removed. {{(pid=70374) wait_for_volumes_removal /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:91}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG os_brick.initiator.linuxscsi [None req-2a3dcb71-5936-424f-ac97-2fef09603f73 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] SCSI volumes sde have been removed. {{(pid=70374) wait_for_volumes_removal /usr/local/lib/python3.10/dist-packages/os_brick/initiator/linuxscsi.py:101}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-2a3dcb71-5936-424f-ac97-2fef09603f73 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Disconnecting from: [('172.16.0.220:3260', 'iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9')] {{(pid=70374) _disconnect_connection /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1160}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9 -p 172.16.0.220:3260 --op update -n node.startup -v manual {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9 -p 172.16.0.220:3260 --op update -n node.startup -v manual" returned: 0 in 0.026s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[03c4dc8a-2bcc-407d-b506-a344824f2aed]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-2a3dcb71-5936-424f-ac97-2fef09603f73 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] iscsiadm ('--op', 'update', '-n', 'node.startup', '-v', 'manual'): 
stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9 -p 172.16.0.220:3260 --logout {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9 -p 172.16.0.220:3260 --logout" returned: 0 in 0.072s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[0706cc1a-f899-452a-90b5-90ad54b9dcd1]: (4, ('Logging out of session [sid: 8, target: iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9, portal: 172.16.0.220,3260]\nLogout of [sid: 8, target: iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9, portal: 172.16.0.220,3260] successful.\n', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-2a3dcb71-5936-424f-ac97-2fef09603f73 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] iscsiadm ('--logout',): stdout=Logging out of session [sid: 8, target: iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9, portal: 172.16.0.220,3260] Jul 27 09:33:00 user nova-compute[70374]: Logout of [sid: 8, target: iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9, portal: 172.16.0.220,3260] successful. 
Jul 27 09:33:00 user nova-compute[70374]: stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): iscsiadm -m node -T iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9 -p 172.16.0.220:3260 --op delete {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG oslo_concurrency.processutils [-] CMD "iscsiadm -m node -T iqn.2010-10.org.openstack:1bee565c-934c-49bb-9a23-cafd79caf6c9 -p 172.16.0.220:3260 --op delete" returned: 0 in 0.043s {{(pid=82164) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG oslo.privsep.daemon [-] privsep: reply[6f9533e0-ef80-415f-a485-d42540b08242]: (4, ('', '')) {{(pid=82164) _call_back /usr/local/lib/python3.10/dist-packages/oslo_privsep/daemon.py:501}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-2a3dcb71-5936-424f-ac97-2fef09603f73 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] iscsiadm ('--op', 'delete'): stdout= stderr= {{(pid=70374) _run_iscsiadm /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/iscsi.py:1026}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.base [None req-2a3dcb71-5936-424f-ac97-2fef09603f73 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Lock "connect_volume" "released" by "os_brick.initiator.connectors.iscsi.ISCSIConnector.disconnect_volume" :: held 0.439s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/os_brick/initiator/connectors/base.py:87}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG os_brick.initiator.connectors.iscsi [None req-2a3dcb71-5936-424f-ac97-2fef09603f73 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] <== disconnect_volume: return (441ms) None {{(pid=70374) trace_logging_wrapper /usr/local/lib/python3.10/dist-packages/os_brick/utils.py:202}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG nova.virt.libvirt.volume.iscsi [None req-2a3dcb71-5936-424f-ac97-2fef09603f73 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] [instance: b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06] Disconnected iSCSI Volume {{(pid=70374) disconnect_volume /opt/stack/nova/nova/virt/libvirt/volume/iscsi.py:79}} Jul 27 09:33:00 user nova-compute[70374]: DEBUG nova.virt.libvirt.guest [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] COPY block job progress, current cursor: 359661568 final cursor: 1073741824 {{(pid=70374) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Jul 27 09:33:01 user nova-compute[70374]: DEBUG nova.compute.manager [req-a57ff960-4e93-4765-a7b3-58b6f787341a req-4cfdee70-4f40-4a19-a161-6023f1a8a0cf service nova] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Received event network-vif-plugged-626b287e-22ee-49d7-8ec3-c1a0660bd5d8 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:33:01 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-a57ff960-4e93-4765-a7b3-58b6f787341a req-4cfdee70-4f40-4a19-a161-6023f1a8a0cf service nova] Acquiring 
lock "309c9c26-4a0f-45db-bb3a-595b19f3f627-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:33:01 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-a57ff960-4e93-4765-a7b3-58b6f787341a req-4cfdee70-4f40-4a19-a161-6023f1a8a0cf service nova] Lock "309c9c26-4a0f-45db-bb3a-595b19f3f627-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:33:01 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [req-a57ff960-4e93-4765-a7b3-58b6f787341a req-4cfdee70-4f40-4a19-a161-6023f1a8a0cf service nova] Lock "309c9c26-4a0f-45db-bb3a-595b19f3f627-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:33:01 user nova-compute[70374]: DEBUG nova.compute.manager [req-a57ff960-4e93-4765-a7b3-58b6f787341a req-4cfdee70-4f40-4a19-a161-6023f1a8a0cf service nova] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] No waiting events found dispatching network-vif-plugged-626b287e-22ee-49d7-8ec3-c1a0660bd5d8 {{(pid=70374) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Jul 27 09:33:01 user nova-compute[70374]: WARNING nova.compute.manager [req-a57ff960-4e93-4765-a7b3-58b6f787341a req-4cfdee70-4f40-4a19-a161-6023f1a8a0cf service nova] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Received unexpected event network-vif-plugged-626b287e-22ee-49d7-8ec3-c1a0660bd5d8 for instance with vm_state active and task_state deleting. Jul 27 09:33:01 user nova-compute[70374]: DEBUG nova.virt.libvirt.guest [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] COPY block job progress, current cursor: 485490688 final cursor: 1073741824 {{(pid=70374) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Jul 27 09:33:01 user nova-compute[70374]: DEBUG nova.network.neutron [-] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Updating instance_info_cache with network_info: [] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:33:01 user nova-compute[70374]: INFO nova.compute.manager [-] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Took 1.60 seconds to deallocate network for instance. 
Jul 27 09:33:01 user nova-compute[70374]: DEBUG nova.compute.manager [req-81ce5401-5646-464e-94d1-8fbedbd9fc36 req-30dbed91-3693-49fc-86c3-52c8693cfe04 service nova] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Received event network-vif-deleted-626b287e-22ee-49d7-8ec3-c1a0660bd5d8 {{(pid=70374) external_instance_event /opt/stack/nova/nova/compute/manager.py:11032}} Jul 27 09:33:01 user nova-compute[70374]: INFO nova.compute.manager [req-81ce5401-5646-464e-94d1-8fbedbd9fc36 req-30dbed91-3693-49fc-86c3-52c8693cfe04 service nova] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Neutron deleted interface 626b287e-22ee-49d7-8ec3-c1a0660bd5d8; detaching it from the instance and deleting it from the info cache Jul 27 09:33:01 user nova-compute[70374]: DEBUG nova.network.neutron [req-81ce5401-5646-464e-94d1-8fbedbd9fc36 req-30dbed91-3693-49fc-86c3-52c8693cfe04 service nova] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Updating instance_info_cache with network_info: [] {{(pid=70374) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Jul 27 09:33:01 user nova-compute[70374]: DEBUG nova.compute.manager [req-81ce5401-5646-464e-94d1-8fbedbd9fc36 req-30dbed91-3693-49fc-86c3-52c8693cfe04 service nova] [instance: 309c9c26-4a0f-45db-bb3a-595b19f3f627] Detach interface failed, port_id=626b287e-22ee-49d7-8ec3-c1a0660bd5d8, reason: Instance 309c9c26-4a0f-45db-bb3a-595b19f3f627 could not be found. {{(pid=70374) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10866}} Jul 27 09:33:01 user nova-compute[70374]: DEBUG nova.objects.instance [None req-2a3dcb71-5936-424f-ac97-2fef09603f73 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Lazy-loading 'flavor' on Instance uuid b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:33:01 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-72d6585d-6a93-4031-938e-d00cbeffa4da tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:33:01 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-72d6585d-6a93-4031-938e-d00cbeffa4da tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:33:01 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-2a3dcb71-5936-424f-ac97-2fef09603f73 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Lock "b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06" "released" by "nova.compute.manager.ComputeManager.detach_volume..do_detach_volume" :: held 2.690s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:33:01 user nova-compute[70374]: DEBUG nova.virt.libvirt.guest [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] COPY block job progress, current cursor: 646971392 final cursor: 1073741824 {{(pid=70374) is_job_complete 
/opt/stack/nova/nova/virt/libvirt/guest.py:846}} Jul 27 09:33:02 user nova-compute[70374]: DEBUG nova.compute.provider_tree [None req-72d6585d-6a93-4031-938e-d00cbeffa4da tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Inventory has not changed in ProviderTree for provider: f7548644-4a09-4ad8-9aa6-6e05d85a9f5b {{(pid=70374) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Jul 27 09:33:02 user nova-compute[70374]: DEBUG nova.scheduler.client.report [None req-72d6585d-6a93-4031-938e-d00cbeffa4da tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Inventory has not changed for provider f7548644-4a09-4ad8-9aa6-6e05d85a9f5b based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16011, 'reserved': 512, 'min_unit': 1, 'max_unit': 16011, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=70374) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Jul 27 09:33:02 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-72d6585d-6a93-4031-938e-d00cbeffa4da tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.567s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:33:02 user nova-compute[70374]: INFO nova.scheduler.client.report [None req-72d6585d-6a93-4031-938e-d00cbeffa4da tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Deleted allocations for instance 309c9c26-4a0f-45db-bb3a-595b19f3f627 Jul 27 09:33:02 user nova-compute[70374]: DEBUG nova.virt.libvirt.guest [None req-62d0b1f3-25c5-4ce5-b0b8-f241dda8b5a4 tempest-TestVolumeSwap-1311737041 tempest-TestVolumeSwap-1311737041-project-admin] COPY block job progress, current cursor: 839909376 final cursor: 1073741824 {{(pid=70374) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Jul 27 09:33:02 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-72d6585d-6a93-4031-938e-d00cbeffa4da tempest-AttachSCSIVolumeTestJSON-284147988 tempest-AttachSCSIVolumeTestJSON-284147988-project-member] Lock "309c9c26-4a0f-45db-bb3a-595b19f3f627" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 3.679s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Jul 27 09:33:02 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-c3ac937b-bb60-485a-a5b2-386f3c4d2423 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Acquiring lock "interface-b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06-e86f785e-63db-4bd0-92b9-6b813174c7a5" by "nova.compute.manager.ComputeManager.detach_interface..do_detach_interface" {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Jul 27 09:33:02 user nova-compute[70374]: DEBUG oslo_concurrency.lockutils [None req-c3ac937b-bb60-485a-a5b2-386f3c4d2423 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Lock 
"interface-b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06-e86f785e-63db-4bd0-92b9-6b813174c7a5" acquired by "nova.compute.manager.ComputeManager.detach_interface..do_detach_interface" :: waited 0.001s {{(pid=70374) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Jul 27 09:33:02 user nova-compute[70374]: DEBUG nova.objects.instance [None req-c3ac937b-bb60-485a-a5b2-386f3c4d2423 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Lazy-loading 'flavor' on Instance uuid b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06 {{(pid=70374) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1139}} Jul 27 09:33:02 user nova-compute[70374]: DEBUG nova.virt.libvirt.vif [None req-c3ac937b-bb60-485a-a5b2-386f3c4d2423 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2023-07-27T09:30:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=InstanceDeviceMetadata,disable_terminate=False,display_description=None,display_name='tempest-device-tagging-server-175106609',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(6),hidden=False,host='user',hostname='tempest-device-tagging-server-175106609',id=11,image_ref='35458adf-261a-4e0b-a4db-b243619b2394',info_cache=InstanceInfoCache,instance_type_id=6,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOTnqzjorB039ilJl8VK7YcSm6BfDbMAMeMlN6KuNuE2BR5Ci7E8V9F4eHRzTVNR5ErMNNWpJQddk0yLJsVN++T9e5hUTfZ9niEsaZXI1P72KyNTxQBI7EWBQg10Ylojmg==',key_name='tempest-keypair-693359237',keypairs=,launch_index=0,launched_at=2023-07-27T09:31:03Z,launched_on='user',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='df3e52a41c1847b199e6dcd09b676fba',ramdisk_id='',reservation_id='r-tordp007',resources=,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='35458adf-261a-4e0b-a4db-b243619b2394',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.6.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-TaggedAttachmentsTest-493274998',owner_user_name='tempest-TaggedAttachmentsTest-493274998-project-member'},tags=,task_state=None,terminated_at=None,trusted_certs=,updated_at=2023-07-27T09:31:03Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='6d33f8cd041046c18af25f56b63b6bb5',uuid=b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e86f785e-63db-4bd0-92b9-6b813174c7a5", "address": "fa:16:3e:f8:a9:cb", "network": {"id": "f4c65a93-da7c-42e7-9c5e-c721d1ecb59e", "bridge": "br-int", 
"label": "tempest-tagged-attachments-test-net-193407618", "subnets": [{"cidr": "10.10.10.0/24", "dns": [], "gateway": {"address": "10.10.10.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.10.80", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.10.10.1"}}], "meta": {"injected": false, "tenant_id": "df3e52a41c1847b199e6dcd09b676fba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape86f785e-63", "ovs_interfaceid": "e86f785e-63db-4bd0-92b9-6b813174c7a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=70374) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Jul 27 09:33:02 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-c3ac937b-bb60-485a-a5b2-386f3c4d2423 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Converting VIF {"id": "e86f785e-63db-4bd0-92b9-6b813174c7a5", "address": "fa:16:3e:f8:a9:cb", "network": {"id": "f4c65a93-da7c-42e7-9c5e-c721d1ecb59e", "bridge": "br-int", "label": "tempest-tagged-attachments-test-net-193407618", "subnets": [{"cidr": "10.10.10.0/24", "dns": [], "gateway": {"address": "10.10.10.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.10.10.80", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.10.10.1"}}], "meta": {"injected": false, "tenant_id": "df3e52a41c1847b199e6dcd09b676fba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape86f785e-63", "ovs_interfaceid": "e86f785e-63db-4bd0-92b9-6b813174c7a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Jul 27 09:33:02 user nova-compute[70374]: DEBUG nova.network.os_vif_util [None req-c3ac937b-bb60-485a-a5b2-386f3c4d2423 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f8:a9:cb,bridge_name='br-int',has_traffic_filtering=True,id=e86f785e-63db-4bd0-92b9-6b813174c7a5,network=Network(f4c65a93-da7c-42e7-9c5e-c721d1ecb59e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape86f785e-63') {{(pid=70374) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Jul 27 09:33:02 user nova-compute[70374]: DEBUG nova.virt.libvirt.guest [None req-c3ac937b-bb60-485a-a5b2-386f3c4d2423 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] looking for interface given config: {{(pid=70374) get_interface_by_cfg /opt/stack/nova/nova/virt/libvirt/guest.py:257}} Jul 27 09:33:02 user nova-compute[70374]: DEBUG nova.virt.libvirt.guest [None req-c3ac937b-bb60-485a-a5b2-386f3c4d2423 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] looking for interface given config: {{(pid=70374) get_interface_by_cfg /opt/stack/nova/nova/virt/libvirt/guest.py:257}} Jul 27 09:33:02 user 
nova-compute[70374]: DEBUG nova.virt.libvirt.driver [None req-c3ac937b-bb60-485a-a5b2-386f3c4d2423 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] Attempting to detach device tape86f785e-63 from instance b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06 from the persistent domain config. {{(pid=70374) _detach_from_persistent /opt/stack/nova/nova/virt/libvirt/driver.py:2477}} Jul 27 09:33:02 user nova-compute[70374]: DEBUG nova.virt.libvirt.guest [None req-c3ac937b-bb60-485a-a5b2-386f3c4d2423 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] detach device xml: [device XML not preserved in this capture] {{(pid=70374) detach_device /opt/stack/nova/nova/virt/libvirt/guest.py:465}} Jul 27 09:33:02 user nova-compute[70374]: DEBUG nova.virt.libvirt.guest [None req-c3ac937b-bb60-485a-a5b2-386f3c4d2423 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] looking for interface given config: {{(pid=70374) get_interface_by_cfg /opt/stack/nova/nova/virt/libvirt/guest.py:257}} Jul 27 09:33:02 user nova-compute[70374]: DEBUG nova.virt.libvirt.guest [None req-c3ac937b-bb60-485a-a5b2-386f3c4d2423 tempest-TaggedAttachmentsTest-493274998 tempest-TaggedAttachmentsTest-493274998-project-member] interface for config: not found in domain: [domain XML not preserved in this capture; surviving fragments: instance-0000000b; b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06; tempest-device-tagging-server-175106609; 2023-07-27 09:32:47; 512; 1; 0; 0; 1; tempest-TaggedAttachmentsTest-493274998-project-member; tempest-TaggedAttachmentsTest-493274998; 524288; 524288; 1; /machine; OpenStack Foundation; OpenStack Nova; 0.0.0; Virtual Machine; hvm; Nehalem; destroy; restart; destroy; /usr/bin/qemu-system-x86_64]
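The detach of interface e86f785e-63db-4bd0-92b9-6b813174c7a5 from instance b6e0f7ba-3fbf-4ecf-bfb4-f818302dde06 traced above goes through libvirt's device-detach API against the persistent and the live domain definitions. Below is a rough sketch of the persistent-config step; the <interface> element is a guess at the shape of what nova generated (the actual XML is not preserved in this capture, though the MAC, bridge, tap name and interfaceid come from the log), while libvirt.open(), lookupByName(), detachDeviceFlags() and VIR_DOMAIN_AFFECT_CONFIG are real libvirt-python calls.

    import libvirt

    # Guessed shape of the <interface> element for the OVS port being detached;
    # the concrete values are taken from the VIF dump earlier in this log.
    iface_xml = """
    <interface type="bridge">
      <mac address="fa:16:3e:f8:a9:cb"/>
      <source bridge="br-int"/>
      <target dev="tape86f785e-63"/>
      <virtualport type="openvswitch">
        <parameters interfaceid="e86f785e-63db-4bd0-92b9-6b813174c7a5"/>
      </virtualport>
      <model type="virtio"/>
    </interface>
    """

    conn = libvirt.open("qemu:///system")
    dom = conn.lookupByName("instance-0000000b")
    # Remove the device from the persistent definition only; nova issues a second
    # detach against the live domain and waits for libvirt's device-removed event
    # before tearing down the host-side plumbing.
    dom.detachDeviceFlags(iface_xml, libvirt.VIR_DOMAIN_AFFECT_CONFIG)
    conn.close()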